Ether1Project / ECPs

ETHO Community Proposals (ECP)
https://ecp.ether1.org
MIT License

Community Standards For Objectionable Content & Kill Switch #6

Closed hashratez closed 3 years ago

hashratez commented 4 years ago

Ether1 allows immutable data to be hosted on the blockchain in total anonymity. This is a noble and just cause in the era of centralized control by both governments and corporations. The definition of free speech and expression varies greatly; it is almost impossible to define, nor should it be.

However, with that said, Ether1 cannot be without limits. Community Standards must be created and enforced for objectionable content such as Child Pornography. Such content simply cannot be defended as any form of free speech or expression.

If Ether1 does not create and enforce Community Standards, it will quickly become a dumping ground for the worst possible content and people on the Internet, which would destroy any value or usefulness for legitimate commerce.

The example of Child Pornography is very simple: everyone gets it. The bigger problem is what comes next? How is it dealt with? Who decides? Where does it end? These are not easy questions to answer. Even the world's largest tech companies such as Google have issues with acceptable standards; I reference this recent exposé on YouTube as a great example of the challenge:

http://progressvideo.tv/videos/is-youtube-doing-enough-to-fight-hate-speech-and-conspiracy-theories

Ether1, with its technology and decentralized approach, has the unique ability to ensure that no single person or group (a "Content Committee", for example) polices the network; instead, the network operators, users, and community members themselves can decide what is objectionable and what is not. Furthermore, the "network" itself can operate a Content Kill Switch (CKS*) to remove reprehensible data.

This ECP proposes a formal discussion of Community Standards and gathers ideas for the CKS, which will become a future ECP with an implementation budget.

Budget $0.

*CKS -- Community Kill Switch or Content Kill Switch, both seem to ring nicely.

Bigpiggy01 commented 4 years ago

I very strongly agree that child pornography and similar content needs to be immediately hammered. However, taking this further than, say, child pornography and/or snuff-type content is several steps too far and very much against the spirit of crypto in general. If we are to be decentralized, this kind of mechanism has to be kept to an absolute minimum, as it is in and of itself a centralizing agent.

The mechanism for this has to be exceptionally well thought out, as we do not want to waste community time and resources on yahoos from either side of the political spectrum getting upset and cross-reporting content.

hashratez commented 4 years ago

The framework that is developed should be totally neutral, with content controlled by the community. It is not about any one person or even a group of people. Who are you, I, or they to decide who is a yahoo? The community should decide. Now, how we do that is indeed a very, very interesting challenge.

At this point I only have some general ideas on the concept. It is impossible for any one person to police the network, so the community needs to do it. In fact, even our community probably can't do it alone, so anyone should be able to "report" content. The report could be a simple submission form on the ethofs.com webpage, for example. The report would go into a queue. In a perfect world, that queue would trigger a voting smart contract with parameters that allow the community to vote yes or no on taking down the content.

How you define the community is itself an issue. Is it node owners, since the content is, in theory, on their boxes? Is it ETHO holders, since they are owners of the project? Is it a combination? I have no idea. But the CKS should not be any single person; it should be a smart contract with a mechanism to remove reported content.
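Purely to make the discussion concrete, here is a rough sketch (in Python, standing in for an eventual on-chain contract) of the report, queue, vote, and takedown flow described above. The class and function names, the quorum and approval thresholds, and the choice of who counts as an eligible voter are all assumptions for illustration, not a decided design.

```python
from dataclasses import dataclass, field

@dataclass
class Report:
    content_hash: str                 # identifier of the hosted content being reported
    reason: str                       # free-text reason from the submission form
    votes_for: set = field(default_factory=set)
    votes_against: set = field(default_factory=set)
    resolved: bool = False

class ContentKillSwitch:
    def __init__(self, eligible_voters, quorum=0.20, approval=0.66):
        self.eligible_voters = set(eligible_voters)  # e.g. node operators and/or ETHO holders
        self.quorum = quorum          # minimum share of eligible voters who must vote
        self.approval = approval      # share of cast votes needed to remove content
        self.queue = {}               # content_hash -> Report

    def submit_report(self, content_hash, reason):
        # Anyone can report; the report simply enters the queue for a community vote.
        self.queue.setdefault(content_hash, Report(content_hash, reason))

    def vote(self, voter, content_hash, remove):
        report = self.queue[content_hash]
        if voter not in self.eligible_voters or report.resolved:
            return
        # A voter may change their vote until the report is resolved.
        (report.votes_against if remove else report.votes_for).discard(voter)
        (report.votes_for if remove else report.votes_against).add(voter)

    def tally(self, content_hash):
        # Returns True if the vote directs hosting nodes to drop the reported content.
        report = self.queue[content_hash]
        cast = len(report.votes_for) + len(report.votes_against)
        if cast < self.quorum * len(self.eligible_voters):
            return False              # not enough participation yet
        report.resolved = True
        return len(report.votes_for) / cast >= self.approval

# Example: three node operators, two vote to remove a reported item.
cks = ContentKillSwitch(eligible_voters=["node-a", "node-b", "node-c"])
cks.submit_report("QmExampleHash", "reported via the ethofs.com form")
cks.vote("node-a", "QmExampleHash", remove=True)
cks.vote("node-b", "QmExampleHash", remove=True)
print(cks.tally("QmExampleHash"))     # True -> hosting nodes would drop the content
```

The quorum and approval values are exactly the kind of "parameters set" the community would need to agree on up front.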

Bigpiggy01 commented 4 years ago

There is an option to sort images and video files by hash so that no one would have to sit and verify actual child pornography or similar material. The problem with that, however, is that law enforcement has been abusing it to track other content as well.
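To illustrate the hash-matching idea in the simplest terms, here is a sketch that checks a file's digest against a list of known-bad hashes so that no person ever has to view the material itself. The hash list here is a dummy placeholder; a real deployment would pull a maintained list from a trusted source, and would likely need perceptual rather than exact hashes (which this sketch does not attempt) to catch re-encoded copies.

```python
import hashlib

# Dummy placeholder list of known-bad SHA-256 digests; a real deployment would
# load a maintained list from a trusted source rather than hard-code anything.
KNOWN_BAD_SHA256 = {
    "0" * 64,  # illustration-only entry, not a real digest
}

def sha256_of_file(path, chunk_size=1 << 20):
    """Hash a file in chunks so large video files never need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path):
    """True if the file's exact SHA-256 digest matches a known-bad entry."""
    return sha256_of_file(path) in KNOWN_BAD_SHA256
```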

Also, I am not sure that the hosting we have attached offers an easy-to-use moderation solution; this may have to be built from the ground up.

No one should be deciding who is and isn't a yahoo; the reporting system needs to sort those out before we get human eyes on anything. The way I see it, a yahoo is, let's say, a donkey who consistently reports all elephant content because fakebook has trained them to be like that, or the other way around.
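One way the reporting system might "sort those out" automatically, sketched here purely as an illustration: ignore new reports from accounts whose past reports were overwhelmingly rejected by the community vote. The threshold and minimum history are arbitrary assumptions.

```python
def should_queue_report(reporter_history, min_reports=5, max_reject_rate=0.8):
    """reporter_history: list of booleans, True if a past report was upheld by the vote.

    Returns True if a new report from this account should enter the review queue.
    """
    if len(reporter_history) < min_reports:
        return True  # not enough history to judge; let the report through
    reject_rate = reporter_history.count(False) / len(reporter_history)
    return reject_rate <= max_reject_rate

# Example: an account whose last six reports were all rejected gets filtered out.
print(should_queue_report([False] * 6))  # False
```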

Like you said, we are still at the formulating ideas stage for many things.