Open npdoty opened 5 years ago
As a technical matter, I think this would need:
And governance/people-wise:
I think the UYA approach is interesting enough to try, though in certain circumstances I could see it being interpreted as condescending.
No objections to the technical matters.
With respect to documented rules: Transparency in communicating decisions/recommendations seems to be practically useful in increasing compliance (https://shagunjhaver.com/files/research/jhaver-2019-transparency.pdf) and I think it would generally increase trust in the system. Having said that, balancing that transparency with the privacy of reports likely requires clear rules ahead of time.
With respect to volunteers, I think it would be useful to explicitly prioritize recruiting a diverse group. For one thing, this would make it slightly more likely that the volunteers are representative of the spectrum of perspectives in the community. Drawing from a broader pool would also likely decrease the number of hops between each community member and a volunteer, which would hopefully increase trust in the mediation team.
What if we had an ombuds/mediation team rather than a moderator/banning team? Could we have volunteers who would send messages asking a user to stop sending unwanted messages to another user? It could work like the "UYA" notices used at MIT in the 1990s, which were said to be very effective: when harassment from an account was reported, an admin would email the harassing account saying "someone using your account did X; your account may have been hacked and you should probably change your password." Reportedly this was remarkably effective at giving people a face-saving out and getting them to stop the behavior.
https://www.metafilter.com/173881/UYA-notices-and-face-saving-in-moderation
/cc @scrivener