Chocobozzz / PeerTube

ActivityPub-federated video streaming platform using P2P directly in your web browser
https://joinpeertube.org/
GNU Affero General Public License v3.0

How to deal with undesirable instances, a.k.a. "Nazis Problem" #430

Closed danielquinn closed 4 years ago

danielquinn commented 6 years ago

I'm curious as to whether PeerTube has a plan to combat the so-called "Nazi Problem". That is, how do you prevent everyone's instances from being linked to (and thereby promoting) content they wouldn't want to be associated with? Even if it's stuff you might not personally disagree with, streaming content that you don't control from a site in jurisdictions where that content is illegal can get you in trouble.

If there is no plan, I would suggest some sort of community tagging feature with a fixed set of software-imposed tags would go a long way. This would allow someone coming across an ISIS promotional video to flag it as terrorism, and then let the network block these videos based on the tagging score. Similar tags could be created for fascism, hate, violence, porn, etc.

I have some Javascript experience, but it's all client-side. I have a lot of Python and SQL in me though, if that's of any value.

Chocobozzz commented 6 years ago

Hi,

rigelk commented 6 years ago

I guess this is related to #406 - as of now instances can freely register on http://instances.peertu.be/ and if we let instances follow it automatically with no prior filtering/tagging, that could just lead to undesirable consequences like this one.

danielquinn commented 6 years ago

@Chocobozzz it's nice to see that some forethought on this issue is already in place, but what I'm reading there is a lot of manual overhead that, on a scaling platform, would be an untenable workload for a human to maintain. Also, nsfw is really broad and ignores cultural differences between countries that might be more puritanical than others. For example, if someone wants to run a channel for sex education, another person has one for straight-up porn, and another is all about hating women, it's likely the first two would get tagged nsfw while the last would come through just fine.

As for @rigelk's point, honestly, I rather like the idea that instances get automatically followed (it strengthens interconnectivity). It's just a question of how to programmatically determine the rules for that auto-follow. If there's a way to define a set of "follow criteria" like "must not have a violence score greater than n", this would allow community members to decide what's share-worthy.
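The "follow criteria" idea could be as simple as comparing published per-tag scores against admin-chosen limits. A minimal sketch (the tag names, score ranges, and the existence of a tagging service are all hypothetical, not anything PeerTube implements):

```python
def should_auto_follow(scores: dict[str, float],
                       limits: dict[str, float]) -> bool:
    """Auto-follow an instance only if none of its tagged scores
    exceed the limits this admin has configured. A tag missing
    from the instance's scores counts as 0.0."""
    return all(scores.get(tag, 0.0) <= limit for tag, limit in limits.items())

# Hypothetical admin configuration: tolerate a little, but no terrorism tags at all.
limits = {"violence": 0.2, "terrorism": 0.0}

should_auto_follow({"violence": 0.1}, limits)                     # follows
should_auto_follow({"violence": 0.1, "terrorism": 0.5}, limits)   # refuses
```

Each admin keeps their own `limits`, so federation policy stays per-instance rather than centralized.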

Chocobozzz commented 6 years ago

Tagging an instance is a very good idea, yep. You talked about community tagging, but how do you imagine it? How do we prevent abuse (for example, a raid by a community that doesn't like the videos of a specific instance)?

danielquinn commented 6 years ago

Yeah that's a tough one. One answer might be considering concepts like community reputation: if a video is brigaded with terrorism tags, but the channel has a history of videos not tagged in that way, the tags themselves become suspect.

You'd maybe also want to have some sort of recourse for bad tagging and consequence for applying a bad tag. So say for example video A is tagged with child-porn by users X, Y and Z. User X has a reputation score of 25 as they've been using the project a long time and had many of their tags confirmed, while users Y and Z have much lower reputations. X's tags would therefore count for more.

At the same time, if X him/herself has a lot of videos tagged with a high child-porn (or other undesirable?) score, then maybe that should count against the value of their tags.

For me, the answer is usually to take skills we already use IRL (like reputation) and apply them to community systems. Basically, if someone with a reputation for being shitty says that someone with a reputation being not-shitty is being shitty, their opinion shouldn't count for much.
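The reputation-weighting scheme described above can be sketched in a few lines. Everything here is a hypothetical illustration of the comment's example (the reputation numbers come from the X/Y/Z scenario; the discount factor is made up):

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    reputation: float     # grows as the user's past tags get confirmed
    flagged_uploads: int  # the user's own uploads carrying this same tag

def tag_weight(user: User, penalty: float = 0.5) -> float:
    """A user's tag counts in proportion to their reputation,
    discounted if their own uploads are flagged with the same tag."""
    return user.reputation * (penalty ** user.flagged_uploads)

def tag_score(taggers: list[User]) -> float:
    """Aggregate score for one tag on one video."""
    return sum(tag_weight(u) for u in taggers)

# The example above: X is long-standing and confirmed, Y and Z are not.
x = User("X", reputation=25, flagged_uploads=0)
y = User("Y", reputation=2, flagged_uploads=0)
z = User("Z", reputation=1, flagged_uploads=3)

score = tag_score([x, y, z])  # dominated by X's weight
```

A video would only be acted on once `score` crosses some threshold, so a brigade of fresh low-reputation accounts moves the needle very little.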

rigelk commented 6 years ago

Maybe we're making this overly complex. How are others doing it already in the fediverse (if applicable, that is)?

voronoipotato commented 6 years ago

Right now it's being done organically on Mastodon/Pleroma/GNUSocial. If some instance is not responding to reports from your instance, then you block them.

I think NSFW is fine; it merely states that it is not safe for your work. Usually people of similar culture cluster on instances, so the term gets a bit of a domain-specific meaning, but there's a universally agreed-upon "NSFW" that has developed somewhat organically. It really hasn't been an issue: when in doubt you tag it, and people will click through when they get home. Mastodon also supports CW: tags of any form. It's just a text field and you can type whatever you want. It's very helpful when someone has a concern that you might not have thought of, like epilepsy.

Nutomic commented 6 years ago

This issue is getting pretty relevant. I received reports for 3 instances that are hosting copyright-protected content. I deleted them from Manage Follows -> Following, but videos and profiles are still accessible via my instance if someone knows the direct link. So at the very least, deleting an instance from following should also delete all content from that instance.

I can provide example links via email or mastodon to developers or instance admins.

ghost commented 6 years ago

I would suggest some sort of community tagging feature with a fixed set of software-imposed tags

and then, with the #784 feature, block/hide all the videos that have certain tags. The #784 feature could be more useful as an extension, now that I'm thinking about all this.

thombles commented 5 years ago

I would like to have a much more restrictive option for my instance—channel pages, videos and subscriptions should only work for local content, plus any domains that have been explicitly followed at a server level.

The existing controls mean that a user who arrives at the homepage won't find porn by browsing around, which is good. However it's a problem if someone can come up with a URL on my instance that shows content from an arbitrary server elsewhere on the fediverse. I understand this is by design and changing it would "break federation" to some degree, but at the moment it's a dealbreaker for making a server that only contains safe content.

Would this sort of toggle work? I could have a look at implementing it if we feel this is a practical approach.

Chocobozzz commented 5 years ago

However it's a problem if someone can come up with a URL on my instance that shows content from an arbitrary server elsewhere on the fediverse.

You should be able to prohibit it: https://github.com/Chocobozzz/PeerTube/blob/develop/config/production.yaml.example#L63

thombles commented 5 years ago

Aha, I missed that config option. Thanks very much!

So on this issue, that's one way to resolve it, but I assume many people will be looking for an in-between option...

styromaniac commented 5 years ago

ZeroNet and IPFS have opt-in blocklists users can seed and toggle on. I maintain one for ZeroNet. With ZeroNet, hashing the addresses is optional. With IPFS, it's not.

If anyone's concerned about a blocklist becoming deceptive or too opaque, just remember that hashes are reproducible. Titles can be provided without a means to visit the sites or content. My blocklist now goes by hashes of site addresses.

People are more safely browsing ZeroNet because of this feature. 🥳

http://127.0.0.1:43110/Styromaniac.blocklist.bit/

I really think this is the best approach.
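The point that "hashes are reproducible" is what makes such a blocklist auditable: anyone holding an address can recompute its hash and verify whether a published entry matches it, without the list advertising the addresses themselves. A minimal sketch (the domain names are made up; SHA-256 is my choice, not necessarily what ZeroNet or IPFS use):

```python
import hashlib

def hash_address(address: str) -> str:
    """Hash an instance/site address so a shared blocklist can be
    published without exposing the addresses themselves."""
    return hashlib.sha256(address.lower().encode("utf-8")).hexdigest()

# A shared, opt-in blocklist stores only hashes (titles could be
# published alongside them for transparency).
blocklist = {hash_address("bad.example.org")}

def is_blocked(address: str) -> bool:
    return hash_address(address) in blocklist

is_blocked("bad.example.org")   # blocked
is_blocked("good.example.net")  # not blocked
```

Subscribers toggle the list on or off; nobody is forced to use it, which keeps the mechanism opt-in rather than centralized.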

ghost commented 5 years ago

Similar to @styromaniac's approach, publishing the followed and blocked lists using ActivityPub (maybe that's already the case) and using that could be an option.

Options:

and the reverse. This way two extra activitypub streams (following, blocked) would be added and one admin page.

I don't know the format of ActivityPub messages, but it could be a list of objects containing the instance URL and the follow/block origin. This would mitigate the scenario of large instances flexing their muscles and blocking competitors or undesirable instances with everybody else following along. Possible scenario:
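For what it's worth, ActivityStreams (which ActivityPub builds on) does define a generic `OrderedCollection` type, so a published block list could plausibly look like the sketch below. The per-item fields (`instance`, `origin`) and the `/blocklist` URL are assumptions from this comment's description, not an existing PeerTube or ActivityPub vocabulary:

```python
import json

def blocklist_collection(instance: str, blocked: list[dict]) -> str:
    """Serialize an instance's block list as an ActivityStreams
    OrderedCollection. Item fields are hypothetical."""
    return json.dumps({
        "@context": "https://www.w3.org/ns/activitystreams",
        "id": f"https://{instance}/blocklist",
        "type": "OrderedCollection",
        "totalItems": len(blocked),
        "orderedItems": blocked,
    }, indent=2)

doc = blocklist_collection("example.org", [
    # "origin" records who originated the block, so followers can see
    # whether they are echoing one big instance or many independent ones.
    {"instance": "https://bad.example.net",
     "origin": "https://example.org"},
])
```

Carrying the block's origin with each entry is what would mitigate the "large instance flexes its muscles and everyone follows along" scenario: subscribers can discount blocks that all trace back to a single origin.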

alxlg commented 4 years ago

You shouldn't. A principle of democracy is the separation of powers. Each country has its laws and its judiciary. Appropriating judicial power is against democracy. A group of people thinking they can judge and condemn others' thought and infringe the separation of powers is exactly what Nazism did. So just let Parliaments decide what is legal and what is not, and let courts judge and condemn single episodes. If you want a certain type of content to be illegal in your country, follow the rules of democracy and make your Parliament vote for it instead of lobbying online development communities.

Lonsfor commented 4 years ago

if you spend any amount of time on the internet you would know that "just leave it to the courts 4Head" doesn't work. also your larpy sermon about democracy is cute and all but democracy is not a permanent thing. it's a thing that needs to be actively maintained, and fascists are a real threat to democracy.

A group of people thinking they can judge and condemn others' thought and infringe the separation of powers is exactly what Nazism did

imagine using "you are the real nazis" unironically in 2020

alxlg commented 4 years ago

@Lonsfor no country in the world has real democracy; we all live in a neo-feudalism of corporations.

Just leave moderation to each instance, that was the point of decentralization. If an instance doesn't accept your content another could or you can host your own instance. If there will be a form of centralized discrimination it will kill this project.

Also I'm from the country of the fascist regime, no other country had fascism. What you call fascism is actually authoritarianism, but if you insist on using the term fascism you will fail to see other forms of authoritarianism, namely neoliberalism, so-called political correctness, scientism etc.

And yes this proposal by PeerTube project is authoritarian and against the principle of decentralization of power.

Lonsfor commented 4 years ago

Also I'm from the country of the fascist regime

are you trying to say that you are from Germany? in which case i don't believe you, nor do i care.

no other country had fascism.

factually wrong, read a book

What you call fascism is actually authoritarianism but if you insist in using the term fascism you will fail to see other forms of authoritarianism

???? i only used fascism one time and in no way was it descriptive, you have no idea what i am thinking.

namely neoliberism

neoliberalism is bad but it is not authoritarianism. let's not be reductive here.

so called political correctness, scientism etc.

oh no you can't say the n-word? oh you must be soooo oppressed! not being able to oppress others, the pinnacle of authoritarianism lmao. also wtf is "scientism", what are you, an anti-intellectual?

ghost commented 4 years ago

@alxlg said it right:

Just leave moderation to each instance, that was the point of decentralization. If an instance doesn't accept your content another could or you can host your own instance. If there will be a form of centralized discrimination it will kill this project.

I think instance admins have all the necessary tools to ignore/block any kind of content, if some tools are missing, then a specific issue with that request should be opened. I think this issue should be closed.

Nutomic commented 4 years ago

@k09i71 You mean like all these issues that have been ignored for almost two years?

https://github.com/Chocobozzz/PeerTube/labels/Component%3A%20Moderation%20%3Agodmode%3A

ghost commented 4 years ago

There are only 2 developers (and a few contribs), not 200. I believe they're going for the most important issues first, and then all the others. Please let's be patient :+1:

onlyjob commented 4 years ago

I just want to say that IMHO blocking any instance is wrong and dangerous for the founding principles of the project.
Would you block all incoming emails from Gmail just because one "bad" user sends nasty emails from his Gmail account?
It is always a case-by-case problem with the behaviour of a particular user rather than a node.
While having a blacklist of (ab)users may be necessary, blocking IP addresses of attackers who might be attempting any form of attack is better done by other means such as firewalls, fail2ban, etc.

Chocobozzz commented 4 years ago

@Nutomic Or maybe these issues that have been implemented for almost two years? Or other tools implemented where issues were not created? Or issues you created and we implemented? I've had enough of your derogatory comments. We are creating free software, and you're not our client. If you don't like our priorities, you're free to implement issues yourself or pay another developer to fix the issues you want.

Closing this discussion, which has become irrelevant.

Niquarl commented 4 years ago

There are only 2 developers (and a few contribs), not 200. I believe they're going for the most important issues first, and then all the others. Please let's be patient 👍

I think this is the problem. What is considered « the most important issues » is a very subjective view. Right now it is clear that moderation has never been considered one, and that is a big mistake. Perhaps the lead dev team doesn't understand the mounting problems with moderation that make hosting and moderating an instance with open registrations, or even just comments, so difficult right now.

There is no way (at least that I know) that allows for a moderator to see a history of comments made by a local user. It is also unclear what blacklisting or muting actually means sometimes. This isn't just a problem with code but also a docs problem.

There has been, it seems, some progress on taking moderation seriously, but considering issues that deal with nazis irrelevant is frankly shocking (previous comment, not yours @k09i71). The fact that PeerTube Isolation was needed should be a wake-up call. One month dedicated to improving moderation seems quite short, given that PeerTube is already over two years old! Other fediverse projects had a better implementation of all this way before, despite the constant opening of "issues" to improve.

If you don't like our priorities, you're free to implement issues yourself or pay another developer to fix the issues you want.

Perhaps we should, if the paid dev team is not good enough. Are there any developers willing to start a fork then? It's a shame that it's needed though, and I'm not sure it's a great mentality to have.

alxlg commented 4 years ago

@Lonsfor only Italy had fascism, Germany had nazism, the others had other ideologies and other regimes.

The unbiased version of this ticket would have been "dealing with federated illegal content"; instead we saw the umpteenth parade of kids egged on by regime propaganda, a.k.a. mass media.

I hope any further moral persuasion attempt will fail, since it's clear today that so-called politically correct arguments hide the evil desire to delegitimize others' ideas.

We saw this so many times in these years that we can clearly see the pattern: someone will publish a video claiming national sovereignty is important to guarantee people's sovereignty and will be labeled as "nationalist" and "fascist", someone will oppose mass immigration as part of capitalistic globalization and he will be labeled as "xenophobe" and "racist", and so on with any possible important social issue.

The tools required in this ticket are not for moderating illegal content, but instruments to put in place the censorship described above.

rigelk commented 4 years ago

@Niquarl moderation tools don't just appear overnight, and as far as I am concerned it's been months since I took up the issue of moderation tools. There are varied concepts (accounts, channels, videos, comments, instances), each of which adds scenarios to deal with and multiplies the implementations required to moderate them.

The fact that PeerTube Isolation was needed should be a wake-up call

Projects like this help like-minded communities to organise, and no tool will effectively replace that.

considering issues that deal with nazis irrelevant is frankly shocking

Please don't twist my words.

The issue is relevant. The discussion is not anymore, as said in the comment you seem to take issue with. The last comments deviated from discussing practical means of tagging content/instances that might help dealing with such content, to discussing the project's general stance on moderation, or even discussing what is/isn't nazi.

The issue was already very unspecific (hence the discussion tag), but now is unhelpful at best. Locking, until a similar, more practical issue opens.