Closed: tpendragon closed this issue 2 years ago
Checked in with @kevinreiss and @escowles. We wrote to Barbara V. and Jen Hunter to get approval to place the harmful content statement page as a child of the About page at https://library.princeton.edu/about/harmfulcontent. We are awaiting approval.
Update: we got approval for the page and its placement. @kevinreiss also had a look and helped with the dropdown placement. The URL is: https://library.princeton.edu/statement-harmful-content
Taking the Block label off this ticket.
The Rights and Permissions link has a sort of alert icon (which, honestly, I'm not sure reads as affiliated with the link, given there's a row of icons before it, but it is there). Do we want an icon for this? If so, which icon? If not, how can we set it apart from the Rights and Permissions link?
Maybe we should remove the Rights and Permissions icon and put in some shading to make these look like buttons?
Here's the PR that added it: https://github.com/pulibrary/figgy/pull/4578
I do feel like there are a number of things a user might want to report about an object: harmful content, metadata problems (offensive language, factual errors, etc.), rights/privacy concerns, and so on. I wonder whether a single alert-style button that brings up a brief page listing those options, with links to the harmful language/content pages, the takedown page, etc., would be better.
So just have the alert icon without any link text? That does seem like it would fit better with what we've done otherwise.
I kind of like the idea of one button, when clicked, giving you options for what you want to alert folks to. It would be cleaner and I think would serve the same purpose. @sdellis , what are your thoughts?
@kelea99 my first thought on this is that the Statement on Harmful Content page does not allow users to report harmful content. It just says we are working on a form that will arrive at some point in the future. My thoughts are that this ticket will be blocked until we can get the appropriate forms in place for reporting.
As for the single button, our recent usability testing with Finding Aids showed that it was unintuitive for folks to use one button for both "suggesting a correction" and "reporting harmful language". If the label were generic enough (e.g., "report a problem"), then it might work to have a dropdown of options. I would advocate for an "Other" option for anything we haven't listed.
One additional issue is that we do not always know where a user is seeing the problem. How will we provide enough contextual information if we are sending the user from the viewer off to another site to report the problem? For example, it could be in a paragraph on page 325 of an ACLU manuscript in an unknown component and unless the user can convey all the correct information manually, we may have trouble tracking it down. We also don't know where to forward the report without the contextual information that tells us who is responsible for taking action, so it would become someone's task to locate the problem and then route the correspondence accordingly.
Finally, people who are reporting harmful content and offensive language may want to do so anonymously, as they may fear repercussions. We should have this conversation before opening such a channel: an anonymous form could easily be abused, which would expose those who manage the queue of complaints to harmful content themselves.
Our colleagues have worked hard on this statement and recommendation. What can we do to ensure we respect that work and implement the output of their time and research?
Full recommendation from that working group is here: https://lib-confluence.princeton.edu/display/COM/Statement+on+Harmful+Content
My reading of this recommendation is that part of the goal is to show that we're taking harmful content seriously as a profession and masking it behind extra buttons or pages will bury its importance. With that in mind, I'd like to repeat the options I've heard so far with some inline questions:
I don't know if this is feasible or practical, but one idea I had was to make the alert icon open a menu, or otherwise expand to show a few sentences briefly explaining the options for reporting items or engaging with us, including links to the takedown form and other forms as they are ready to link to. Doing this in the viewer instead of on a separate page would address the concern @sdellis raised above about losing the referrer URL (which I agree is essential in those forms to make the feedback actionable). In theory, I could imagine adding even more info to the URL (such as which page the viewer is displaying) to make this even more granular than it is now.
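To make the "more info in the URL" idea concrete, here is a minimal sketch of building a report-form link that carries the viewer's context along with it. This is not Figgy's actual API; the function name, form URL, and query parameters (`referrer`, `canvas`) are all assumptions for illustration.

```python
# Hypothetical sketch: prefill a "report a problem" link with enough
# context (the item URL and the page/canvas being viewed) that staff
# can locate the problem and route the report without guesswork.
from urllib.parse import urlencode

def report_url(form_base, item_url, canvas_index=None):
    """Return a report-form URL carrying the viewer's current context."""
    params = {"referrer": item_url}
    if canvas_index is not None:
        # Include which page/canvas the viewer is displaying, so a report
        # about e.g. page 325 of a manuscript is actionable.
        params["canvas"] = canvas_index
    return f"{form_base}?{urlencode(params)}"
```

A link built this way would let the form (or the staff member triaging it) recover both the object and the exact page, addressing the "page 325 of an ACLU manuscript in an unknown component" problem without relying on the user to transcribe identifiers by hand.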
If it helps, I was on this group and can give my two cents:
I may have misunderstood. This ticket is not about implementing a way to report harmful content or description, it's about making people aware of our statements and policies. Just to be clear, the Statement on Harmful Content provides no information on who to contact or how. I think if we care about this issue (which we all do) we need to provide a good user experience for actually reporting harmful language. A bad experience or dead end is going to make it look like we threw it together to check a DEI box without actually investing in the goal.
Also, if it's placed on the viewer, where do we put the content warning for content that is not digitized? Or are we only warning users about digital content?
FWIW, the icons I've seen for reporting problems are typically "word bubbles with an exclamation mark" or a flag (for flagging problems).
I like the idea of a word bubble with exclamation mark icon that triggers a pulldown menu with each of these links.
We can implement this similar to the Rights button.
This is blocked until there's a public page to link to. Presumably @kevinreiss or @escowles can tell us when that page is up.
Relevant recommendations from the harmful content working group:
Sudden Priority Justification
The output of this group is in direct support of the library's mission, vision, and north star statements. We should get it in place as soon as possible to support that work and ensure it's respected.