Open: @regular opened this issue 4 years ago
Update: I now think it's a bad idea to stream data to the client before ssb can check the blob's integrity. Only once we have all the data can we verify that the SHA-256 hash is correct; presenting data to a client before that point would potentially expose it to malicious content. A bad actor could run a pub that responds to every blob request with made-up data, hoping clients render it progressively and only discover the fake when it is too late (a virus embedded in a PDF, a streamed fake video, ...).
Instead, I can solve the above use case with a more detailed `changes` stream. Such a stream could include pending transfers, with progress information that is updated periodically.
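For illustration, entries in such a `changes` stream might look like this (the field names `id`, `size`, `received`, and `state` are assumptions for the sketch, not an actual ssb-blobs format):

```javascript
// Hypothetical shape for one entry in a more detailed `changes` stream.
// A pending transfer is re-emitted periodically as `received` grows.
function progressEvent (id, size, received) {
  return {
    id,                                              // blob id ("&...sha256")
    size,                                            // total bytes expected
    received,                                        // bytes transferred so far
    state: received < size ? 'pending' : 'complete', // transfer status
    percent: Math.round((received / size) * 100)     // for a progress bar
  }
}

// A UI could render sync feedback from periodic events like this one:
const ev = progressEvent('&Abc...=.sha256', 4 * 1024, 1024)
console.log(ev.percent, ev.state) // 25 'pending'
```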
I like your train of thought @regular
Heads up: I've been refactoring this module to make it easier to read and maintain.
Also, just checking: do you know about `ssb.blobs.push`?
You could mess with that and `sympathy` to force blobs out to all peers (I imagine eager loading would be very useful in some kiosk-type setups).
Currently, for a peer to get a blob, it has to:
1. call `ssb.blobs.want(id)` and wait for its callback, which fires once the blob has been fully replicated and verified
2. call `ssb.blobs.get(id)` to read the now-local blob
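The current two-step flow (first `want` the blob, then `get` it once it is fully replicated) can be sketched with an in-memory stub standing in for a real ssb instance — `want` and `get` mirror the shape of the real ssb-blobs calls, but the storage here is just a `Map`, and the real `get` returns a pull-stream source rather than taking a callback:

```javascript
// In-memory stub standing in for ssb.blobs (illustration only).
const store = new Map([['&fake.sha256', Buffer.from('blob bytes')]])

const blobs = {
  // want(id, cb): cb fires with true once the blob is fully replicated
  want (id, cb) { process.nextTick(cb, null, store.has(id)) },
  // get(id, cb): simplified; the real call returns a pull-stream source
  get (id, cb) { process.nextTick(cb, null, store.get(id)) }
}

// Current flow: the first byte reaches the application only after
// want() resolves, i.e. after the whole blob is transferred and verified.
blobs.want('&fake.sha256', (err, has) => {
  if (err || !has) throw err || new Error('blob not available')
  blobs.get('&fake.sha256', (err, data) => {
    if (err) throw err
    console.log(data.toString()) // prints "blob bytes"
  })
})
```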
The problem here is that the first byte of data becomes available to the application only after the blob has been transferred from another peer in its entirety. This becomes an issue when dealing with very large blobs (I have blobs of multiple GB), because the UI cannot show any feedback about the sync progress of a particular blob.
I'd like to add an API that has the same net effect as the two steps above, but streams data to the application as it arrives from another peer, and also tells its caller the total blob size to expect ahead of time, so that a progress bar can be rendered.
Usage would be something like:
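A sketch of what that could look like in pull-stream style — the name `getStream`, its meta callback, and the stub data below are all assumptions about the proposed API, not anything ssb-blobs exposes today:

```javascript
// Minimal pull-stream-style source: a function (abort, cb), as used
// throughout the ssb ecosystem. No dependency on the pull-stream module.
function blobSource (chunks) {
  let i = 0
  return function read (abort, cb) {
    if (abort) return cb(abort)
    if (i < chunks.length) cb(null, chunks[i++])
    else cb(true) // end-of-stream signal, per pull-stream convention
  }
}

// Hypothetical proposed API: stream chunks as they arrive, and deliver
// meta (total size, source peer) before any data.
function getStream (id, onMeta) {
  const chunks = [Buffer.from('chunk one '), Buffer.from('chunk two')]
  const size = chunks.reduce((n, c) => n + c.length, 0)
  onMeta(null, { size, peer: 'net:203.0.113.7:8008' }) // size known up front
  return blobSource(chunks)
}

// Drain the source, tracking progress as bytes arrive.
let received = 0
let total = 0
const read = getStream('&fake.sha256', (err, meta) => {
  if (err) throw err
  total = meta.size // enough to size a progress bar before any data
})
;(function next () {
  read(null, (end, chunk) => {
    if (end) return
    received += chunk.length
    console.log(`progress: ${received}/${total}`)
    next()
  })
})()
```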
meta could be
`{size, peer}`
I'll probably implement this with an instance of pull-notify per blob id that is live-streamed.
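A minimal, dependency-free sketch of that idea — one notifier per blob id, broadcasting chunks to whoever is listening while the transfer is in flight. The real pull-notify module's `listen()` returns a pull-stream source; it is simplified here to plain listener callbacks:

```javascript
// Stand-in for pull-notify: notify(data) pushes to all live listeners,
// end() signals end-of-stream to them.
function Notify () {
  const listeners = []
  function notify (data) { for (const l of listeners) l(null, data) }
  notify.listen = function (fn) { listeners.push(fn) }
  notify.end = function () { for (const l of listeners) l(true) }
  return notify
}

// One notifier per blob id, created lazily on first interest.
const notifiers = new Map()
function notifierFor (id) {
  if (!notifiers.has(id)) notifiers.set(id, Notify())
  return notifiers.get(id)
}

// As chunks arrive from a peer, push them to every live listener.
const n = notifierFor('&fake.sha256')
const got = []
n.listen((end, chunk) => { if (!end) got.push(chunk.toString()) })
n('part one ')
n('part two')
n.end()
console.log(got.join('')) // part one part two
```

Keying the notifiers by blob id means a second caller asking for the same in-flight blob just attaches another listener instead of triggering a second transfer.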
Any thoughts?