soledadli closed this issue 1 year ago.
Really great prompts for the discussion. Susanna and @JamesSCTJ may have some thoughts that they've already put into this.
Here are some of my thoughts with justifications to get you started (I'm sorry I can't be at the meet up as I have a full day meeting for a different project!)
How many moderators' approvals are needed to publish an experience? 1.
Can moderators see other moderators' decisions (approval, disapproval), or do they remain anonymous on the platform? Moderators remain anonymous to each other, but moderation decisions are visible. Admins can see each moderator's decisions and can send them a private email if there is a problem.
Can moderators message each other on the platform? No, they will not have that option on the platform. There may be a separate Slack channel for moderators to talk to each other outside of the platform.
If users‘ experiences are disapproved, can they ask for a new pair of moderators for a new decision? No, this won't be necessary.
If newly submitted experiences get disapproved again, does that count as a second warning or still as one? We will hold off on the warnings for now.
If experiences are disapproved, is it mandatory for moderators to comment on why, or shall moderators give reasons only once users ask for them? Moderators will leave a comment in a text box explaining where the issue is and how to correct it, referring to the code of conduct.
What are the selection criteria for new moderators? Moderators will have to be familiar with the code of conduct and must be able to apply it (assessed via practice cases); we will aim to have a significant number of autistic moderators.
Thanks for all the inputs and discussion! I will close this issue now.
@soledadli - this issue should close when the content has been captured in the github repository - I have reopened. Please can you follow up with notes in a pull request?
@KirstieJane Got it. I will capture the notes in the GitHub repository and then close the issue.
Meet up from November 30, 2021:
Harmful reports will be sent to moderators. Administrators will have access to view all harmful reports, but will not receive a notification for each experience.
Moderators will select experiences on their own for moderation. Once an experience is selected, they should moderate it within 24 hours; otherwise, it will be put back into the waiting pool.
Email should be avoided on the platform, as it is not user-friendly and makes it hard to document the process.
Let me know if I missed anything from the meet-up discussion today. @GeorgiaHCA @skfantoni @badgermind @jhlink @anoura12
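The claim-and-timeout flow from the meet-up notes (a moderator selects an experience and must moderate it within 24 hours, or it returns to the waiting pool) could be sketched like this. `WaitingPool` and its methods are illustrative names only, not the platform's actual code:

```python
import time

CLAIM_WINDOW = 24 * 60 * 60  # 24 hours, in seconds, per the meet-up decision

class WaitingPool:
    """Pending experiences that moderators can claim for review."""

    def __init__(self):
        self._pending = set()   # experience ids awaiting a moderator
        self._claims = {}       # experience id -> timestamp when claimed

    def submit(self, experience_id):
        """A newly submitted experience enters the waiting pool."""
        self._pending.add(experience_id)

    def claim(self, experience_id, now=None):
        """A moderator selects an experience for moderation."""
        now = time.time() if now is None else now
        self._expire_stale(now)
        if experience_id not in self._pending:
            raise KeyError("not available for moderation")
        self._pending.remove(experience_id)
        self._claims[experience_id] = now

    def _expire_stale(self, now):
        # Claims older than 24h are released back into the waiting pool.
        for eid, claimed_at in list(self._claims.items()):
            if now - claimed_at > CLAIM_WINDOW:
                del self._claims[eid]
                self._pending.add(eid)

    def available(self, now=None):
        """Experiences currently claimable, after expiring stale claims."""
        now = time.time() if now is None else now
        self._expire_stale(now)
        return set(self._pending)
```

The `now` parameter exists only to make the timeout behaviour testable; in production the current time would be used.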
I have the following questions about harmful reports:
- Who can submit harmful reports? Every viewer, or only registered users?
- What is the mechanism for a harmful report?
- Can viewers/registered users label triggering content on published experiences themselves if they think moderators may have missed something?
- Or do they need to file a report and let moderators decide how to re-label the triggering content?
- (We discussed in the last meet-up that even if harmful reports are made, experiences will remain on the platform.)
We will hold off on the "flag for harmful reports" feature for the MVP, as it will complicate the current infrastructure. Apart from that, further clarity will be needed on the "harmful reports" process.
The role of "administrators/reviewers" needs to be discussed further, as the "flag for help" feature will add workload on the development side.
I am wondering whether 24 hours may be too short for moderation. Shall we provide an option to extend the moderation time, or generally extend it to something longer? @GeorgiaHCA @skfantoni @badgermind
- SL: A feature to extend the time may complicate the infrastructure. I am wondering whether we can extend the time.
For the time being, let's try 24 hours. If moderators provide feedback that it's too short, then we can increase the time.
@jhlink Got it! I will keep the 24-hour limit then.
Thanks James @jhlink for providing such detailed feedback on the moderation workflow on Slack. Here is the documentation of that feedback:
- `Published Experiences` write `harmful reports`. It's not clear to me what `harmful reports` entail or what function they're supposed to fill when moderating reported experiences.
- `New Experiences SQL` is a little misleading. I think it's clearer to say `Pending Pool of New Experiences`, or a collection of experiences pending review. Submitted experiences are already submitted and stored on the SQL database; moderators selecting or unselecting them only modifies the experience in that it's being reviewed by a moderator.
- Why `ambiguous` experiences rather than `new` experiences? I think the name of the process should be more specific.
- `mailto/mailfrom` options?

@jhlink Hi James, thanks again for the wonderful feedback! 🎉 Let me know whether the updated diagram and notes here clear up your confusion.
> `Published Experiences` write `harmful reports`. It's not clear to me what `harmful reports` entail or what function they're supposed to fill when moderating reported experiences.

I updated the `write harmful reports` part of the diagram. You can find it above.

> `New Experiences SQL` is a little misleading. I think it's clearer to say `Pending Pool of New Experiences`, or a collection of experiences pending review. Submitted experiences are already submitted and stored on the SQL database; moderators selecting or unselecting them only modifies the experience in that it's being reviewed by a moderator.

I renamed it to `Pending Pool of New Experiences` in the diagram.

> Why `ambiguous` experiences rather than `new` experiences? I think the name of the process should be more specific.

`Ambiguous` experience looks much clearer than `unsure`, so I updated the name in the diagram. `Ambiguous` experiences appear in the moderating-new-experiences loop because they are `new` experiences at first: when moderators are not sure about their decision, these `new` experiences are flagged as `ambiguous` instead. `Ambiguous` experiences also appear in moderating disapproved experiences.

> `mailto/mailfrom` options?
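The status flow described in this exchange (experiences start as `new`, get flagged `ambiguous` when moderators are unsure, and `ambiguous` experiences appear in both the new-experience and disapproved-experience loops) can be modelled as a small state machine. The states and transitions below are one hypothetical reading of the diagram, not its actual definition:

```python
# Allowed status transitions, as a hypothetical reading of the diagram:
# a new experience can be published, disapproved, or flagged ambiguous;
# an ambiguous experience is eventually resolved either way;
# a disapproved experience can be revised and re-enter moderation as new.
TRANSITIONS = {
    "new": {"published", "disapproved", "ambiguous"},
    "ambiguous": {"published", "disapproved"},
    "disapproved": {"new"},
}

def advance(status: str, decision: str) -> str:
    """Move an experience to its next status, validating the transition."""
    if decision not in TRANSITIONS.get(status, set()):
        raise ValueError(f"cannot go from {status!r} to {decision!r}")
    return decision
```

Making the transition table explicit keeps the moderation loops consistent: an `ambiguous` flag can only arise from a `new` (or resubmitted) experience, matching the explanation above.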
Summary
This issue aims at collecting the questions that need to be discussed for the moderation workflow.
Please comment on this issue with all questions you think will need to be discussed for the moderation workflow to be created successfully.
Details
Questions (Discussed & Solved)
Moderation Workflow
Updated Version (Dec 13th, 2021)
Original Version
Related Issues
Suggestions for tasks that can fix this issue
Who can help
Anyone interested in moderation.