flowjs / flow.js

A JavaScript library providing multiple simultaneous, stable, fault-tolerant and resumable/restartable file uploads via the HTML5 File API.

Add doc about encryption #337

Closed bertrandg closed 3 years ago

AidasK commented 3 years ago

Nice job! @drzraf @evilaliv3 Can you think of any additions to this?

drzraf commented 3 years ago

An asymmetric StreamEncryptor?

// Encrypts each file's stream with OpenPGP so flow.js reads chunks
// from the ciphertext instead of the plain file.
class StreamEncryptor {
    constructor(gpgKeys) {
        this.gpgKeys = gpgKeys;
        // One stream reader per file, keyed by the flow.js uniqueIdentifier.
        this._reader = {};
    }

    async init(flowObj) {
        const { message } = await openpgp.encrypt({
            message: openpgp.message.fromBinary(flowObj.file.stream(), flowObj.file.name),
            publicKeys: this.gpgKeys
        });

        this._reader[flowObj.uniqueIdentifier] = openpgp.stream.getReader(message.packets.write());
        // compute_pgp_overhead() (user-supplied) estimates the bytes the OpenPGP
        // packet framing adds, so flow.js knows the final upload size up front.
        flowObj.size = flowObj.file.size + compute_pgp_overhead(this.gpgKeys, flowObj.file.name);
    }

    async read(flowObj, startByte, endByte, fileType, chunk) {
        const buffer = await this._reader[flowObj.uniqueIdentifier].readBytes(flowObj.chunkSize);
        if (buffer && buffer.length) {
            return new Blob([buffer], {type: 'application/octet-stream'});
        }
        // Returning undefined signals end-of-stream to flow.js.
    }
}

var encryptor = new StreamEncryptor(gpgKeys);
new Flow({
    // ...
    asyncReadFileFn: encryptor.read.bind(encryptor),
    initFileFn: encryptor.init.bind(encryptor),
    forceChunkSize: true,
});
bertrandg commented 3 years ago

@drzraf nice! :) I've never used openpgp and I'm curious about it. I see it encrypts the message (the file bytes, here) with the recipient's public key. So using a stream, you need to upload chunks one after another? (no chunk upload parallelization)

In my case, I use AES symmetric keys (to encrypt/decrypt files) managed with RSA asymmetric key pairs.

We can update the doc page with both of our file-encryption methods.

AidasK commented 3 years ago

@bertrandg that would be awesome, can you add it please? 👍

drzraf commented 3 years ago

> @drzraf nice! :) So using a stream, you need to upload chunks one after another? (no chunk upload parallelization)

The trick is that my query() {} uploads each chunk to a ${chunk.offset}-derived URL in order to create an OpenStack Swift DLO (Dynamic Large Object), which Swift reassembles by itself.

Since each chunk has its own URL, I can upload in parallel: thanks to v3 stream support and respect for simultaneousUploads, and since encryption is fast compared to upload, it ends up being parallel uploads of a sequential (and non-racy, thanks to readStreamGuard / readStreamChunk) stream read.
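A minimal sketch of that URL-derivation trick, assuming flow.js's target option accepts a function receiving the file and chunk (the Swift account/container path and the `segmentUrl` helper are illustrative, not flow.js API):

```javascript
// Hypothetical helper: build a per-chunk Swift segment URL from a chunk offset.
// Zero-padding keeps segment names in lexicographic = byte order, which is
// what Swift DLO relies on to reassemble the object.
function segmentUrl(base, fileId, offset) {
  return `${base}/${fileId}/${String(offset).padStart(8, '0')}`;
}

// Wiring it into flow.js (sketch):
// new Flow({
//   target: (file, chunk) =>
//     segmentUrl('/v1/AUTH_acct/container', file.uniqueIdentifier, chunk.offset),
//   simultaneousUploads: 3,
// });
```

Because each chunk PUTs to its own segment object, the server never has to reassemble anything itself and parallel chunk uploads stay safe.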

bertrandg commented 3 years ago

I've updated the doc file with your example @drzraf

AidasK commented 3 years ago

Sweet, can we merge it @drzraf ?