peer / mind

Decide together.
http://peermind.org

Color-Coding the Relevant Confidence Value #181

Open · mkanwal opened this issue 7 years ago

mkanwal commented 7 years ago

In the case of a supermajority vote, the most likely outcome does not necessarily match the sign of the average vote. For supermajority items, the displayed confidence should correspond to the more likely outcome, and it would be good to color-code the confidence according to the outcome it refers to: green for passing, red for failing.

For example (from a real motion):

Votes: 49
Abstentions: 1
Confidence: 0.08
Result: 0.02

Because the average is positive, the confidence is computed for Pr(motion achieves supermajority), but that outcome is extremely unlikely (8% chance). Since the two outcomes are complementary, this motion has in fact reached quorum for the alternative, Pr(motion achieves 1/3 against), with 1 - 0.08 = 0.92 confidence, above the 90% threshold. It is therefore important to clearly display that outcome rather than confusing voters into thinking that no outcome has been reached.

The solution would be:

# Confidence is initially computed for the outcome suggested by the sign of
# the average; flip it so it always refers to the more likely outcome.
outcome = average > 0  # True = passing, False = failing
confidence = compute_confidence()
if confidence < 0.5:
    outcome = not outcome
    confidence = 1 - confidence

# quorum For
if outcome and confidence > 0.9:
    print_confidence(confidence, color="green")
# quorum Against
elif not outcome and confidence > 0.9:
    print_confidence(confidence, color="red")
# no outcome
else:
    print_confidence(confidence, color="grey")
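
To sanity-check the flipping logic, here is a small, self-contained version of the same idea run on the motion above. The function name and its interface are illustrative, not part of PeerMind, and the confidence computation itself is assumed to be given.

def display_outcome(average, confidence, threshold=0.9):
    """Return (passing, confidence, color) for the more likely outcome.

    `confidence` is assumed to be Pr(outcome matching the sign of `average`),
    as in the snippet above; everything here is illustrative only.
    """
    passing = average > 0
    if confidence < 0.5:
        passing = not passing
        confidence = 1 - confidence
    if confidence > threshold:
        color = "green" if passing else "red"
    else:
        color = "grey"
    return passing, confidence, color

# The motion from the example above: average 0.02, Pr(supermajority) = 0.08.
print(display_outcome(0.02, 0.08))
# -> (False, 0.92, 'red'): quorum has been reached against the motion.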
mitar commented 7 years ago

I slightly disagree with the proposed change because I think it would confuse users. Currently the idea is roughly that if the average is above 0 and the confidence is above 0.90, the motion passes. But if we instead show the other confidence (1 - confidence), the one for failing (even in a different color), this motion would show an average of 0.02 and a confidence of 0.92, and it would look as if it is going to pass (because the average, 0.02, is above 0).

Moreover, we allow votes to be changed, so existing voters could still change their votes to push the confidence higher.

I think we should find some other way to convey this. Maybe once drawing an interval on the graphs is implemented, it will be easier to see this?
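
For illustration, a minimal sketch of what such an interval could look like, assuming votes on a -1 to +1 scale, a +1/3 supermajority threshold for passing, a plain normal approximation for the interval, and a made-up vote split that roughly matches the numbers in the example above; none of this is PeerMind's actual computation.

import math

def mean_vote_interval(votes, z=1.645):
    """Approximate two-sided 90% confidence interval for the mean vote."""
    n = len(votes)
    mean = sum(votes) / n
    var = sum((v - mean) ** 2 for v in votes) / (n - 1)
    half_width = z * math.sqrt(var / n)
    return mean, (mean - half_width, mean + half_width)

# 49 votes, nearly evenly split, like the motion discussed above.
votes = [1] * 25 + [-1] * 24
mean, (low, high) = mean_vote_interval(votes)
print(f"mean {mean:+.2f}, 90% interval ({low:+.2f}, {high:+.2f}), pass threshold +0.33")
# The whole interval lies below +1/3, so a glance at the graph would show
# that reaching the supermajority is unlikely, without flipping any number.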