
Argus: content delivery authentication #3152

Open bedeho opened 2 years ago

bedeho commented 2 years ago

Background

Right now, anyone can fetch any content from any distributor. This unconstrained form of service provisioning is not really workable and leaves the economics of the system too vulnerable to parasitic use. The long-term goal is a scheme where gateways foot the bill for clients; however, some basic form of authentication makes sense even before we get that far. A separate problem is that unlisted videos are effectively accessible to the whole world, despite being hidden from the interface of compliant UIs.

Proposal

Distributors authenticate requests to fetch data and require some form of gateway-specific authorization. This will require changes to Orion, and possibly also the runtime, at least to represent gateways as actors, along with their endpoints. The main design challenge will be how to make this authentication efficient, in terms of how often it must be done, despite the fact that distributors and Orion instances do not share any state. Doing some sort of interactive challenge-response step for every client request would certainly not work. We should also be sensitive to the problem of requiring any sort of signing by the client, as that would create horrible UX if they had to sign with a wallet account. Clients should instead be able to be gateway users without a wallet, $JOY or a membership.

Note that, to support blocking access to unlisted media files, you have to be able to discern who the viewer is. The owner most likely still wants to be able to view the video from the infrastructure; it's just not supposed to be available to everyone else. Perhaps also content working group actors? Unsure.
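To make the non-interactive requirement concrete, here is a minimal sketch, assuming gateways hold a keypair whose public key distributors can already resolve (e.g. registered on-chain). The gateway issues short-lived signed tokens to its users, and a distributor verifies them statelessly. The names, fields and token format are purely illustrative, not an existing Joystream API.

```typescript
// Hypothetical sketch, not an existing Joystream API: a gateway mints short-lived,
// signed access tokens for its users (who hold no wallet, JOY or membership), and a
// distributor verifies them statelessly against the gateway's public key.
import { generateKeyPairSync, sign, verify, KeyObject } from 'crypto'

interface AccessTokenPayload {
  gatewayId: string // which gateway vouches for this request
  objectId: string  // data object being fetched
  expiresAt: number // unix ms; keeps tokens short-lived
}

// Gateway side: mint a token; no interactive step with the distributor is needed.
function issueToken(payload: AccessTokenPayload, gatewayPrivateKey: KeyObject): string {
  const body = Buffer.from(JSON.stringify(payload))
  const signature = sign(null, body, gatewayPrivateKey) // ed25519
  return `${body.toString('base64url')}.${signature.toString('base64url')}`
}

// Distributor side: verify with no shared state and no round trip to the gateway.
function verifyToken(token: string, gatewayPublicKey: KeyObject): AccessTokenPayload | null {
  const [body, signature] = token.split('.').map((part) => Buffer.from(part, 'base64url'))
  if (!verify(null, body, gatewayPublicKey, signature)) return null
  const payload: AccessTokenPayload = JSON.parse(body.toString())
  return payload.expiresAt > Date.now() ? payload : null
}

// Example key material: const { publicKey, privateKey } = generateKeyPairSync('ed25519')
```

Since verification only needs the gateway's public key, every distributor can check tokens locally, so no per-request coordination with Orion or the gateway is required.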

traumschule commented 1 year ago

To be honest I had the same idea at first, but meanwhile I don't see this going anywhere. You built a distributed communication device that was intended to be censorship free. In the process we learned there may be cases where the DAO / providers are better served by complying with legal requests or being proactive. Yet the suggested authentication towards the GW will only "protect" against "unauthorized" visitors. The GW may still endanger the provider by serving content against local laws, which might result in legal consequences for the provider (not necessarily the GW, see Piratebay, 2012). As an effect, there will be different trust levels and prices between providers and GWs.

The way I see it is that curators are hired to screen content and to signal possible risks connected to assets to SPs/DPs, who are still bound to serve all channels they agreed to towards their WG lead and can't just drop single assets (at least we don't want them to), or they get punished.

There are multiple outcomes (and likely more than listed here) for when curators warn Storage / Distribution about content:

Note this is not about disliking (political) content but about preventing takedown notices from judicial entities who will be curious whom to blame. Their interest might not be how our DAO is structured but purely whose IP address appeared in a log. Providers might want to protect themselves by limiting the data they store about consumers.

Apart from the above, the economic benefit to the DAO of serving highly wanted content may even outweigh that of upholding moral standards. We don't know how the council will prioritize those aspects in the future.

However, the suggested feature is without question a possible enhancement that providers may adopt. The challenge is to establish a communication protocol that eases negotiation between GW and DP over which end-user addresses are allowed to access which assets (so far neither Argus nor Colossus has a concept of country-based white-/blacklists per asset). The GW would have to (pay in advance and) inform a provider that an asset was requested by a user, with the goal that the provider dynamically whitelists that asset for a specific address. In some cases this may not be the preferred way for users: someone who paid for content expects to download the file from different locations. In that case the GW would be interested in offering its customers a way to whitelist multiple locations per user. This process could still be too complicated for some users.
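A rough sketch of what such a GW-to-DP whitelisting exchange could look like, purely as an assumption: neither this endpoint nor these fields exist in Argus or Colossus today.

```typescript
// Hypothetical GW -> DP whitelisting exchange; endpoint and fields are assumptions.
interface WhitelistRequest {
  assetId: string
  clientAddresses: string[] // allow several locations per paying user
  ttlSeconds: number        // how long the grant should stay valid
  paymentProof?: string     // e.g. a reference to a prepaid settlement
}

interface WhitelistGrant {
  assetId: string
  clientAddresses: string[]
  expiresAt: number // unix ms
}

// Gateway side: ask a distributor to whitelist an asset for its user's address(es).
async function requestWhitelist(distributorUrl: string, req: WhitelistRequest): Promise<WhitelistGrant> {
  const res = await fetch(`${distributorUrl}/api/v1/whitelist`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify(req),
  })
  if (!res.ok) throw new Error(`whitelist request rejected: ${res.status}`)
  return (await res.json()) as WhitelistGrant
}
```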

bedeho commented 1 year ago

Thank you for the reply, but regretfully it mixed too many points together in such a way that I am not really sure what exactly you are trying to say. Let me respond to a few things and you can perhaps clarify.

To be honest I had the same idea at first, but meanwhile I don't see this going anywhere.

I don't really understand what you mean here in terms of not going anywhere.

If you mean it is not:

a) important: then I think you are incorrect; without this, a core economic value driver does not exist for creators and the platform, and apps can use the system parasitically.
b) technically feasible: then I think you are incorrect, but you'd have to be more specific.
c) going to be done by the DAO: then I have no clue, but it certainly should be.

You built a distributed communication device that was intended to be censorship free.

I don't think that is the case; it is no more censorship free than YouTube or Rumble choose to be. This is a policy decision for the DAO, but there is certainly no imperative or constraint that it has to allow all assets in the system; I think that would be disastrous for quite obvious reasons. Nothing Jsgenesis has ever stated has been along the lines of what you are saying.

The way i see it is that curators are hired to screen content and signal possible risks connected to assets to SP/DP

This whole proposal really has nothing to do with restricting the presence of content on the system, which seems to be what you are mostly writing about. It is about restricting who can access whatever content is on the system. Currently, there are no restrictions, which is not workable.

I suspect you have not understood what this proposal is really about; check out these more specific issues that relate to this topic.

traumschule commented 1 year ago

Sorry for being so unclear. No, I believe it is crucial economically for the DAO to control who consumes bandwidth, but it can't protect content from being copied out of the CDN and distributed without restrictions (which is beside the point).

Without having seen your revenue forecasts (and there might be alternative ways to foot the bill), I'd lean towards d): that approach might actually harm the DAO.

I might be totally wrong and it's actually worth getting more feedback about this question via a survey for example.

This unconstrained form of service provisioning is not really workable, and leaves the economics of the system too vulnerable to parasitical uses.

Let's find out why it's not workable. The assumption is that someone on the internet uses up bandwidth without paying (by watching ads or subscribing commercially). YouTube's answer to this is not denying access but rate limiting (~50 KB/s), which seems to be the more elegant solution. Hope that helps.
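For illustration only, a crude sketch of what that alternative could look like on the distributor side, assuming unauthenticated requests are piped through a throttled stream rather than rejected. The 50 KB/s figure and the Transform-based approach are assumptions, not anything Argus does today.

```typescript
// Illustration only: throttle unauthenticated downloads instead of rejecting them.
import { Transform, TransformCallback } from 'stream'

class Throttle extends Transform {
  constructor(private readonly bytesPerSecond: number) {
    super()
  }

  _transform(chunk: Buffer, _enc: BufferEncoding, cb: TransformCallback): void {
    // Delay each chunk in proportion to its size so average throughput
    // stays roughly at bytesPerSecond.
    const delayMs = (chunk.length / this.bytesPerSecond) * 1000
    setTimeout(() => cb(null, chunk), delayMs)
  }
}

// Usage: anonymousFileStream.pipe(new Throttle(50 * 1024)).pipe(res),
// while authenticated gateway traffic is piped through directly.
```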

traumschule commented 1 year ago

That said, there might be an easy solution for GWs to access premium bandwidth: for example, submitting a signed timestamp, signed with the GW worker key, together with their workerId in a request header. That way no changes to Orion or the runtime are necessary, and only Argus / Colossus would need to verify the header if present and be able to allocate bandwidth per request.
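A minimal sketch of that header check, assuming the distributor can resolve the workerId to the worker's on-chain role key; the header shape, clock-skew window and use of @polkadot/util-crypto are assumptions.

```typescript
// Hypothetical sketch only: verifying a gateway's "premium bandwidth" header.
// Assumes the distributor can map workerId -> on-chain worker role key address.
import { signatureVerify } from '@polkadot/util-crypto'

const MAX_CLOCK_SKEW_MS = 60_000 // reject stale or replayed timestamps

interface PremiumHeader {
  workerId: number  // gateway worker id taken from the request header
  timestamp: number // unix ms, freshly signed by the GW worker key
  signature: string // hex-encoded signature over the timestamp string
}

// Distributor (Argus/Colossus) side: check the header against the worker's key.
function isPremiumRequest(header: PremiumHeader, workerAddress: string): boolean {
  if (Math.abs(Date.now() - header.timestamp) > MAX_CLOCK_SKEW_MS) return false
  const { isValid } = signatureVerify(String(header.timestamp), header.signature, workerAddress)
  return isValid
}
```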

In my eyes the outcome of #3806 will determine whether apps will prioritize the DAO CDN over alternative format providers hired by the DAO as GWs, for example. For the short term I see a service-first, runtime-later approach to making Gleev actually enjoyable by adopting adaptive bitrate and DASH.

bedeho commented 1 year ago

but it can't protect content from being copied out of the CDN and distributed without restrictions

OK, but that is clearly true indeed; it is also true for Spotify, Netflix and YouTube right now. I'm not sure what to do about that, or that anything can be done about it.

Youtube's answer to this is not no access but rate limiting

If you want to rate limit, that would still imply some authentication for those who are not rate limited. Moreover, there are other reasons to authenticate beyond mere monetization, if you look at the list above.