Closed: JasonCarswell closed this issue 5 years ago
- Privately and publicly notate and share content about the Wikipedia article without changing the article.
Private / public isn't the responsibility of this front-end; whatever serves the content can implement that. The "notate" part is implemented; "share" will be done once the front-end can request data from somewhere.
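To make the "share" part concrete, here's a minimal sketch of the request the front-end might build. The endpoint path, query parameter, and note fields are all assumptions for illustration; the real back-end would define its own API.

```javascript
// Hypothetical note shape (field names are assumptions, not a real API):
// { article: "Phobos_(moon)", author: "JasonCarswell", visibility: "public", text: "..." }

// Build the URL the front-end would request notes from.
function notesUrl(baseUrl, articleTitle) {
  return `${baseUrl}/notes?article=${encodeURIComponent(articleTitle)}`;
}

// Keep only the notes belonging to the article being viewed.
function notesForArticle(notes, articleTitle) {
  return notes.filter((n) => n.article === articleTitle);
}

console.log(notesUrl("https://example.org", "Phobos (moon)"));
// With a real back-end, this URL would be passed to fetch() and the JSON
// response run through notesForArticle() before display.
```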
- Inject clearly differentiated content into the article pages, like an overlay of sorts. Whether this affects the CSS code or is some kind of semi-transparent layer is up to the developers.
Please clarify.
- Highlight existing article content.
Tricky, but I think I can do it. TODO: file a separate request (either of us can).
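The tricky part of highlighting is marking matches without editing the article itself. A real implementation would have to walk DOM text nodes (e.g. via Range APIs); this sketch works over a plain string just to show the core matching step.

```javascript
// Return [start, end) index pairs of every occurrence of `phrase` in `text`.
function findHighlightRanges(text, phrase) {
  const ranges = [];
  if (!phrase) return ranges;
  let from = 0;
  for (;;) {
    const i = text.indexOf(phrase, from);
    if (i === -1) break;
    ranges.push([i, i + phrase.length]);
    from = i + phrase.length;
  }
  return ranges;
}

// Splice <mark> tags around each range. A DOM version would split text
// nodes or use Range.surroundContents instead of string slicing.
function markRanges(text, ranges) {
  let out = "";
  let prev = 0;
  for (const [s, e] of ranges) {
    out += text.slice(prev, s) + "<mark>" + text.slice(s, e) + "</mark>";
    prev = e;
  }
  return out + text.slice(prev);
}

console.log(markRanges("alpha beta alpha", findHighlightRanges("alpha beta alpha", "alpha")));
// → "<mark>alpha</mark> beta <mark>alpha</mark>"
```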
- Distinguish who's notating what.
I'm not a UI expert; how would this be done?
- Comment on others' notes and comments.
Responsibility of the back-end, but I need some way of displaying this. Shouldn't be too hard.
- Classify, warn about, and/or block others' notes and comments, plus ways to count them. This will help to identify infiltrators, trolls, SJWs, alt-right, truthers, etc., and folks may pick their tribes as they see fit. These user/comment points/classifications should be voted upon here in development and limited in number so it doesn't become too chaotic. Other meta-tag classifications may remain unlimited in number, yet be cumulatively voted upon, so that a clearer idea of how the content is viewed may be tallied.
I'm not getting into this; sorry. Also, that's the responsibility of the back-end. I'll have to provide a way of displaying a "vote / rating / tagging" of posts, though, so it's possible to implement in the back-end.
- The "voting" system might benefit from applying categorization techniques from WikiData (https://www.wikidata.org/wiki/Wikidata:Main_Page), which is presently a chaotic mess but could one day be promising for A.I., assuming the political MSM biases were filtered out, perhaps with the aid of WikiNotate.
Again, back-end and far future.
- The classification/voting system must be very expandable. Who knows what the future might want.
TODO: Separate request.
- Other ways to prevent infiltrator influences from drowning out other perspectives' free speech and representation, no matter how "fringe" or taboo.
Back-end.
- Prevention of bot tampering.
Back-end. This is literally just a display system at the moment.
- Display modes:
  - short conventional Wikipedia article
  - extended article with excessive details
  - highlighted "alternative perspectives" / "taboo" / "fringe" / "fancruft" content
  - display options for notes, classifications, voting, etc.
I'll add a generic filtering system and let the back-end decide. I'd like this to be useful for more than truthering.
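A generic filter could look something like this: the front-end hard-codes no categories and simply shows or hides injected blocks by whatever tags the back-end attaches. The tag names ("fringe", "fancruft") below are examples from this thread, not a fixed set.

```javascript
// Show only the notes carrying at least one enabled tag.
function visibleNotes(notes, enabledTags) {
  const on = new Set(enabledTags);
  return notes.filter((n) => n.tags.some((t) => on.has(t)));
}

const notes = [
  { text: "episode guide", tags: ["fancruft"] },
  { text: "alt theory", tags: ["fringe"] },
  { text: "both", tags: ["fringe", "fancruft"] },
];
console.log(visibleNotes(notes, ["fringe"]).length); // the 2 "fringe" notes
```

Because the filter is just set membership over back-end-supplied tags, any future classification scheme works without front-end changes.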
I'm sure I'll come up with more.
Keep them coming!
If the default Wikipedia page is white, then the "injected" content might be in highlighted boxes, e.g. yellow for open/public/editable notes, green for private/shared notes, blue for private/personal notes, etc. This colour highlighting might also apply to the citation numbers.
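The colour scheme above is easy to express as a lookup the front-end could apply when styling each injected box. The visibility names and colours are exactly the ones proposed here; the fallback colour is an assumption.

```javascript
// Visibility → highlight colour, per the suggestion above.
const NOTE_COLOURS = {
  public: "yellow",   // open/public/editable notes
  shared: "green",    // private/shared notes
  personal: "blue",   // private/personal notes
};

// Fall back to a neutral colour for anything unclassified.
function noteColour(visibility) {
  return NOTE_COLOURS[visibility] || "lightgray";
}

console.log(noteColour("shared")); // prints "green"
```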
I made this comment before you said you liked the idea of the "injected content" coming after the targeted paragraph. Forget about the semi-transparent overlay stuff.
A WikiNotate history page may be in order, much like the regular wiki history. This could apply to all the open/public/editable notes. It would mean less "distinguishing" to show, but more to dig through.
Alternatively, each section of injected content could have a little triple-dash menu icon in the corner to bring up more information.
Perhaps, at the end of the injected content or in the menu, the identities could follow two common formats: 1) on wiki talk pages one can sign with "~~~~", which leaves your name linked to your profile and talk page, a thank-you, and the date; 2) an avatar image icon plus your name and a date, where the icon and name link to your profile and the date is a permanent link to the changes.
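The two signature formats could be rendered roughly as below. Format 1 approximates what "~~~~" expands to on wiki talk pages (the real expansion includes a timestamp with timezone); in format 2 the profile URL shape, avatar path, and permalink scheme are all made-up placeholders.

```javascript
// Format 1: approximation of the "~~~~" talk-page signature.
function wikiStyleSignature(user, date) {
  return `[[User:${user}|${user}]] ([[User talk:${user}|talk]]) ${date}`;
}

// Format 2: avatar + name link to the profile; the date is a permalink
// to that revision of the note. All URL shapes here are hypothetical.
function avatarStyleSignature(user, date, revisionId) {
  return `<a href="/user/${user}"><img src="/avatar/${user}.png" alt=""/>${user}</a> ` +
         `<a href="/note/${revisionId}">${date}</a>`;
}

console.log(wikiStyleSignature("JasonCarswell", "2019-04-01"));
```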
"Votes", "ratings", and "(meta-)tagging" are different things. I would want them all; the more ways to do this, the better. Naturally some may be abandoned for whatever reasons while other new classifications make themselves needed. Just try not to paint yourself into a corner. For example, SaidIt has only two voting options because the Reddit coders didn't imagine having more. Limited. SaidIt also has no emoticons, images, etc. in comments. Limited. Also critical are nested classifications, or folders, e.g. entertainment > sports > football > helmets, and multiple tagging, e.g. sports = entertainment AND activity; helmets = football, hockey, costumes, battle gear, head gear.
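The nested-folders-plus-multiple-parents idea can be sketched with a small parent map, using the exact examples above: a tag may sit under several folders, and filtering by a parent folder matches notes tagged with any descendant.

```javascript
// Each tag lists every folder it belongs to (multiple parents allowed).
const PARENTS = {
  sports: ["entertainment", "activity"],
  football: ["sports"],
  hockey: ["sports"],
  helmets: ["football", "hockey", "costumes", "battle gear", "head gear"],
};

// Expand a tag to itself plus every ancestor folder, guarding against cycles.
function ancestors(tag, parents = PARENTS, seen = new Set()) {
  if (seen.has(tag)) return seen;
  seen.add(tag);
  for (const p of parents[tag] || []) ancestors(p, parents, seen);
  return seen;
}

// Does a note tagged `tag` match a filter on `folder`?
function matches(tag, folder) {
  return ancestors(tag).has(folder);
}

console.log(matches("helmets", "entertainment")); // true: helmets > football > sports > entertainment
```

Because the map is plain data, new folders and new parent links can be added later without changing the matching code, which is one way to avoid painting yourself into a corner.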
I agree. There's more banned on Wikipedia than just truthers. "Fringe" covers truthers as well as anything else not popular enough to have much MSM coverage, at least when "they" insist on it. Besides "fringe", another thing that's censored is "fancruft", which they might consider obsessive detail about something. They will have every episode of Star Trek documented but not other shows, and call the other shows fancruft but not the Star Trek. It's so arbitrary. That's where things like Wookieepedia come in.
Display modes are critical. This will help distinguish the tone of things, whether from the left-right, or atheist-religious, or alternative theories, histories, underdog perspectives, taboos, NSFW, childsafe, etc. Too often there are many valid (or even invalid) views on things and they are not allowed.
I'm guessing there are other things that get censored on Wikipedia too but I haven't run into any that I recall.
I'm not exactly sure how all of this can fit in with the current Wikipedia system, but those are the things this app seems to need.