Issue #2072 · Open · opened 5 years ago
Doesn't this contradict the network's aim to stop censorship?
@SamTebbs33 no single party is in control of the network, so there is no censorship in the traditional sense. On the other hand, individual miners should have the option to opt out of storing particular illicit content. On the global scale, this means that the more miners reject a particular piece of data (whether by coincidence or by coordination), the less likely it is to be found on the weave, and the harder it is to find if it remains.
@ldmberman is totally right. I can't imagine things would go well for a node owner harboring child pornography, for example. However, if all it takes is excluding metadata, I don't think there is a great way to prevent it. Since this is an analog legal issue, maybe it makes the most sense for nodes to maintain individual legal agreements that at least try to filter specific metadata. Then at least they can say that the content was against their legal policy. I'm pretty sure that's how cloud storage services get away with storing potentially illegal content. Either way, this should be addressed before a node owner gets burned, especially since node owners currently have a tendency to burn one another with false reports to service providers.
Because voting already works so well in Quora, StackOverflow, Forums, etc etc...
You can't ever use sentiment for anything - the crazies/spammers/morons/outraged/deniers/alarmists/extremists/... outweigh the balanced/knowledgeable by more than an order of magnitude.
Any updates on this?
The solution isn't one that can work as far as I can tell. Anyone can flag anything they want as hate or porn (whether or not it really is), or more to the point - this idea gives everyone the ability to destroy any content they want just by spawning some bots to fake-flag stuff.
The only way I can imagine this could possibly work is for some oracle to be involved, who is incentivised to accurately tag the content of others and penalised for doing it wrongly (e.g. for being an outlier in an anonymous "consensus vote" on content attributes). You could probably piggy-back off existing content-censorship solutions (e.g. post all images to Facebook and monitor whether the Facebook morality team cuts them) - but that's not scalable (or, ironically, moral either - stealing Facebook employee time to classify stuff).
IMHO - and controversially - the absolute best idea is just to kill Arweave: almost the only genuine use case for Arweave is trying to bypass censorship and (generally speaking) to keep despicable stuff alive. Sometimes the best idea is not to build something that is going to be monopolised by the bad guys in the first place. Nobody actually needs Arweave right now - there are plenty of other ways to do more-or-less equivalent things, with none of the evil consequences plaguing this one.
An oracle would be a single controllable point of censorship.
Somehow stopping Arweave: you could use the same arguments against the entire internet. Better to close it down for the sake of the children, or terrorists, or whatever excuse is the flavour of the time; countries like North Korea do exactly that. Not the best solution IMO.
@rosmcmahon you realise this topic is specifically intended to facilitate a censorship solution for Arweave, right? For dealing with despicable content?
The Internet already has an enormous ecosystem of censorship, along with identity and other lawfully enforceable protections, so your argument makes no sense. "Shut down the internet" is nothing like "give up trying to build something that will be used almost exclusively by the scum who need to bypass the enforcement of existing censorship".
If you haven't noticed already, everything new online that can in any way be used by bad people for bad things tends to end up being monopolised by those bad people to expand their nasty activities. Your head is buried in the sand if you think that "end censorship" means "yay to morally correct free speech". No. What it means is "need to say something that nobody wants you to say? Use Arweave". That means almost everything that the entire internet censorship system is built to protect us all from will gravitate to you. "Who controls the past controls the future" misses the fundamental problem: "who controls the misinformation narrative controls the future". Taking away our ability to combat lies is not a good thing. Sure, censorship is used in other ways as well, but it's used morally correctly more often than not.
I'd say the premise that the primary purpose of Arweave is censorship-resistance is wrong. The goal of Arweave is to facilitate storage in a particular way - with a pay-once model, a certain number of replicas, no vendor lock-in, and attractiveness for integration with other blockchains. As for censorship resistance, those who serve data publicly bear some legal responsibility, no matter whether it is Arweave data or not. Miners are not required to store everything that is in the weave.
The answer is allowing network participation for nodes that only contain transactional data like ownership records (and minified smart contracts).
That way external parties, which are completely off-chain, can put together filters that erase all the bad (or unwanted) metadata and therefore prevent the distribution of bad content.
With that, if a node doesn't have the non-transactional data, then the requesting node just has to keep scanning through the node list for one that does.
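As a rough illustration, the scan-the-node-list fallback described above could be sketched like this (the `Node` class, its `blacklist`, and the `fetch` helper are all hypothetical, not part of Arweave's actual API):

```python
# Hypothetical sketch: nodes may opt out of serving some content;
# a requester keeps scanning the node list until one serves the data.

class Node:
    def __init__(self, blacklist, store):
        self.blacklist = blacklist   # tx ids this node refuses to hold/serve
        self.store = store           # tx id -> data blob

    def get_data(self, tx_id):
        if tx_id in self.blacklist or tx_id not in self.store:
            return None              # node opted out or lacks the data
        return self.store[tx_id]

def fetch(peers, tx_id):
    """Scan through the node list for a peer that serves the data."""
    for peer in peers:
        data = peer.get_data(tx_id)
        if data is not None:
            return data
    return None  # every reachable peer has filtered the content out
```

The point of the sketch is that filtering degrades availability per node without erasing the data network-wide, as long as at least one peer still stores it.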
The advanced answer is to organize on-chain "immunology representatives" that actually have the ability to erase transactions (such as smart contracts) from the history of the "community nodes", while still allowing "forensic nodes" to exist that retain everything.
Lately there has been a shift in the way data providers are deemed responsible for the content they make available on the Internet. From general non-liability, we are shifting towards guilt by default.
This will become a problem for Arweave's nodes very quickly. This needs to be addressed, preferably at the protocol level.
I open this issue so that ideas on how to address this can be gathered from the community.
So below I lay out the current state of my thinking on the issue and give some ideas:
It would be interesting to track and store a kind of "Community Sentiment" (CS) score along with the stored data. Miners/stakeholders could vote on data to change this score, which would be available with the data, so that we know that a particular stored element, even if stored in the weave forever, was mostly rejected by the community and considered wrong. (How to weigh the votes? By stake? This could be another interesting incentive for miners and holders, with consequences requiring attention...)
This way, nodes would also be able to use this score to filter the content they serve, completely automatically.
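A minimal sketch of such a stake-weighted score, assuming votes of ±1 and a locally configured serving threshold (the function names, the weighting scheme, and the threshold are all invented for illustration, not part of any Arweave spec):

```python
# Illustrative stake-weighted "Community Sentiment" tally.

def sentiment_score(votes, stakes):
    """votes: {address: +1 or -1}; stakes: {address: token stake}.
    Returns a weighted score in [-1.0, 1.0]."""
    total = sum(stakes[a] for a in votes)
    if total == 0:
        return 0.0
    return sum(v * stakes[a] for a, v in votes.items()) / total

def should_serve(votes, stakes, threshold=-0.5):
    # A node could refuse to serve data whose weighted score falls
    # at or below its locally configured threshold.
    return sentiment_score(votes, stakes) > threshold
```

Because the threshold is local, each node keeps the final say; the weave-wide score only informs that decision.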
Moreover, this kind of metadata would also be interesting from a historian's perspective...
Now, the above 'solution' would only account for generally wrong content, such as child pornography and the like. The solution needed would require more than just a general moral score. If some content is illegal in your country and you happen to run an Arweave node capable of delivering such content, you're in trouble...
So, I think that broader metadata needs to be defined, maybe by country and by content attributes.
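For illustration, such per-transaction metadata and a node-side policy check might look like the following (the field names, country codes, and the `node_policy` helper are all hypothetical):

```python
# Hypothetical moderation metadata attached to a transaction.
metadata = {
    "tx_id": "abc123",
    "attributes": ["violence"],     # community-assigned content tags
    "restricted_in": ["DE", "FR"],  # countries where the content is illegal
}

def node_policy(meta, node_country, blocked_attributes):
    """Decide whether this node should store/serve the transaction."""
    if node_country in meta["restricted_in"]:
        return False  # illegal in this node's jurisdiction
    if set(meta["attributes"]) & set(blocked_attributes):
        return False  # operator has opted out of these content types
    return True
```

Each node evaluates the same metadata against its own jurisdiction and preferences, so no single party decides what is filtered globally.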
So each node would be able to read the metadata and decide whether to deliver the content, or even whether to store it at all (in some countries it's illegal even to possess certain content!).
For now, the easiest way to implement such a feature would be through the Chrome extension, where users would be able to review stored data and create the associated metadata by submitting a particular transaction...
This solution would prevent broad censorship while still allowing for some regulation in the parts of the world where it's deemed necessary...