LibertyDSNP / spec

The DSNP Spec and Website
https://spec.dsnp.org

Request for Proposals: Content removal affordances and semantics #259

Open wesbiggs opened 10 months ago

wesbiggs commented 10 months ago

This issue is derived from #184 and ensuing discussions.

Definitions

Note: These are my interpretations of these concepts for the purposes of this discussion, and may not completely cover all colloquial usage of the terms. If you disagree with these assertions or feel different terminology is needed, please feel free to comment.

Regulatory compliance refers to the ability of a government or related institution to regulate and enforce the removal of speech it deems to be illegal. The specific acts of speech deemed illegal may vary from jurisdiction to jurisdiction and may change over time. Regulatory compliance often extends liability to, or creates obligations for, service providers that are involved in the hosting or dissemination of content. It may also create obligations with respect to the handling of user data and data removal requests from users.

Contractual compliance refers to the ability of a service provider to enforce a contract (typically in the form of terms of service) with its customers (users), and often extends into rules for conduct and speech. Contractual compliance implies certain powers that are available to the service provider, including content removal, contract termination, and potentially fines.

Censorship is typically the name given to enforcement of regulations that are politically motivated and generally not universally accepted; through that lens, censorship is by nature an opinion one holds about the legitimacy of regulatory or contractual limitations on speech.

Moderation is the ability of an appointed entity to enforce speech norms in a specific context. Content that is otherwise compliant (legally and contractually) may still be subject to moderation depending on the context. Moderation can take the form of content screening, or can be applied post-hoc. Traditionally, the operators of social networking platforms have either performed their own moderation or provided tools for users to self-moderate groups or discussions.

Curation has some conceptual overlap with moderation, but can be thought of as the ability of an entity that controls a user experience to prioritize or promote certain content, and deprioritize or demote other content, in order to impact the likelihood that it is seen by the viewer. Curation is increasingly done algorithmically by social media platforms, but often has manual aspects. Users may have agency (alongside platform operators) in curation, such as choosing which users or topics to follow.

Content, for purposes of this discussion, means any speech conveyed via DSNP in the form of announcements.
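
For concreteness, an announcement is a small metadata record rather than the content itself. Below is a minimal sketch of a Broadcast (new top-level content) announcement, using the field names from the spec's Broadcast announcement but with illustrative types and values; treat it as a sketch, not a normative schema.

```typescript
// Illustrative sketch of a DSNP Broadcast announcement (new top-level content).
// Field names follow the spec's Broadcast announcement; exact encodings (e.g.
// how fromId and contentHash are serialized) vary by spec version.
interface BroadcastAnnouncement {
  announcementType: 2; // Broadcast
  fromId: string;      // DSNP User Id of the poster
  url: string;         // where the content document is hosted
  contentHash: string; // hash of the content at `url`, for integrity checking
}

const example: BroadcastAnnouncement = {
  announcementType: 2,
  fromId: "123456",
  url: "https://provider.example/content/abc.json", // illustrative URL
  contentHash: "bciqd...",                          // illustrative digest
};
```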

Goals

We want DSNP to be practical for service providers and safe for users. What do we mean by this, and how do we achieve these goals?

Practical for service providers: Service providers should have the ability to enforce regulatory and contractual rules about the content that they (1) host and/or (2) disseminate. They should also be free to moderate and curate the experiences of users that they contract with.

Safe for users: Users should be free to seek out service providers that align with their values. They should feel safe that the DSNP ecosystem is not a haven for illegal content or activity. They should have agency over their own data and content.

Content removal

Content removal can take several forms.

Note that the nature of a DSNP system precludes the ability to remove content announcements (that is, metadata containing the content URL and hash) from batch publications, so that is out of scope.
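
To illustrate why removal therefore happens at the hosting layer: the announcement row (URL plus hash) is immutable once batched, but the content behind the URL can be taken down, at which point consumers should treat it as removed. Here is a minimal consumer-side sketch, under the simplifying assumption that contentHash is a plain hex SHA-256 digest (real DSNP hashes are multiformat-encoded and need proper decoding).

```typescript
import { createHash } from "node:crypto";

// Sketch: the announcement itself cannot be deleted from a published batch,
// but the content behind its URL can disappear or stop matching its hash.
async function resolveContent(a: { url: string; contentHash: string }) {
  const res = await fetch(a.url);
  if (res.status === 404 || res.status === 410) {
    return { state: "removed" as const }; // host no longer serves the content
  }
  const body = await res.text();
  // Simplifying assumption: contentHash is a bare hex SHA-256 digest.
  const digest = createHash("sha256").update(body).digest("hex");
  if (digest !== a.contentHash) {
    return { state: "invalid" as const }; // body no longer matches the announcement
  }
  return { state: "ok" as const, body };
}
```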

Affordances

Specifically, we are seeking proposals for the technical tooling, at the protocol level, that is required to provide the affordances necessary to meet these goals.

This might include:

User stories for content removal

The proposal should address one or more (and maybe even all) of the following scenarios:

  1. As a user, I want to remove an item of content I have previously announced via DSNP. This should be possible regardless of my current relationship with the provider hosting this content. (Currently implemented via Tombstone Announcement, with semantics of network removal; see the sketch after this list.)
  2. As a service provider, I want to remove an item of content that violates my terms of service. This should be possible regardless of my past or current relationship with the user who created this content.
  3. As a designated moderator, I want to remove an item of content that violates my community's rules.
  4. As a service provider, I want to remove an item of content for regulatory compliance reasons. This should be possible regardless of my past or current relationship with the user who created this content.
  5. As a service provider, I no longer wish to host a piece of content because I am no longer contractually required to do so.
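
As background for story 1, a Tombstone announcement points at a previously published announcement from the same user and signals that it should be treated as removed network-wide. A sketch using the Tombstone field names from the spec (exact types and encodings vary by version, so verify against the current Tombstone Announcement definition):

```typescript
// Illustrative Tombstone announcement (user story 1): the original poster
// signals that a previously announced item should be treated as removed.
// Field names follow the spec's Tombstone announcement; encodings vary.
interface TombstoneAnnouncement {
  announcementType: 0;            // Tombstone
  fromId: string;                 // must match the fromId of the target announcement
  targetAnnouncementType: number; // e.g. 2 for a Broadcast
  targetSignature: string;        // identifies the announcement being tombstoned
}
```

Note that a Tombstone only covers story 1 (removal by the original poster); stories 2 through 5 involve actors other than the poster, which is precisely the gap this RFP asks proposals to address.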

Submissions welcome

Discussion, ideas, or submissions are all gratefully accepted. Submissions should describe the changes required to the DSNP specification, but a formal pull request to the spec repo is not necessary.

wesbiggs commented 4 months ago

Partially addressed with #273

wesbiggs commented 4 months ago

A good discussion on blocklists in IPFS: https://github.com/ipfs/notes/issues/284

A rejected proposal on blocklist format: https://github.com/ipfs/notes/issues/284

Various rabbit holes descend from those issues.

Protocol Labs operates https://badbits.dwebops.pub/, which has a deny list that's roughly 28MB in size as of this posting (428k entries). Note that there is no metadata included in this file.
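
For reference, here is a sketch of checking a CID against such a denylist. This assumes the legacy badbits format, where each entry line is `//` followed by the hex SHA-256 of the base32 CIDv1 plus a trailing slash; verify the exact preimage rules against the badbits documentation before relying on this.

```typescript
import { createHash } from "node:crypto";

// Load the denylist text (~28 MB) into a Set of "//<hex>" entries once.
function parseDenylist(text: string): Set<string> {
  return new Set(text.split("\n").filter((line) => line.startsWith("//")));
}

// Assumption: each entry is "//" + hex SHA-256 of "<cidv1-base32>/"
// (the legacy badbits anchor format).
function isDenied(cidV1Base32: string, denylist: Set<string>): boolean {
  const anchor = createHash("sha256")
    .update(cidV1Base32 + "/") // CID plus trailing slash (empty subpath)
    .digest("hex");
  return denylist.has("//" + anchor);
}
```

The double-hashing means the published list does not itself enumerate the blocked CIDs, which likely also explains the absence of metadata in the file.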

One schema for takedown notices can be observed at https://lumendatabase.org/, which is a repository for voluntary submissions of things like DMCA requests.
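
For comparison, a takedown-notice record along those lines might carry fields like the following. This is a hypothetical minimal shape loosely modeled on the kinds of fields visible on Lumen's site, not Lumen's actual API schema.

```typescript
// Hypothetical minimal shape for a takedown notice record; illustrative only.
interface TakedownNotice {
  kind: "DMCA" | "court_order" | "other";
  sender: string;        // party requesting removal
  recipient: string;     // service provider receiving the request
  dateReceived: string;  // ISO 8601 date
  jurisdiction?: string; // legal basis may be jurisdiction-specific
  targetUrls: string[];  // content the notice asks to be removed
  description: string;   // claimed basis for removal
}
```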