Closed: drzraf closed this issue 4 years ago.
Hey :wave: This chunk size comes from the browser's implementation of `file.stream()`, not from this library. See https://github.com/w3c/FileAPI/issues/144.
However, if you want a specific chunk size, you can use this library to achieve it. See the example in the readme for getting a chunk size of 1024 bytes (just remove `encryptor.process` and write `chunk` as-is). You can do this either before encrypting (if you want it to encrypt n bytes at a time) or after encrypting (if you want all chunks to be n bytes).
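The re-chunking idea above can be sketched as a standalone transform. This is not this library's API, just a minimal illustration of buffering a stream's browser-chosen chunks and re-emitting them at a fixed size, the way you would before handing the data to the encryptor:

```js
// Sketch (hypothetical helper, not part of this library): re-chunk a
// ReadableStream of Uint8Array into fixed-size chunks.
function rechunk(stream, chunkSize) {
  let buffer = new Uint8Array(0);
  return stream.pipeThrough(new TransformStream({
    transform(part, controller) {
      // Append the incoming bytes to the pending buffer.
      const merged = new Uint8Array(buffer.length + part.length);
      merged.set(buffer);
      merged.set(part, buffer.length);
      buffer = merged;
      // Emit as many full-size chunks as are available.
      while (buffer.length >= chunkSize) {
        controller.enqueue(buffer.slice(0, chunkSize));
        buffer = buffer.slice(chunkSize);
      }
    },
    flush(controller) {
      // Emit whatever remains as a final, possibly smaller chunk.
      if (buffer.length > 0) controller.enqueue(buffer);
    },
  }));
}
```

For example, piping a stream through `rechunk(stream, 1024)` yields 1024-byte chunks regardless of the sizes the browser's `file.stream()` chose, with only the last chunk possibly shorter.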
Alternatively, if you just want to read the encrypted data 1MB at a time, for example, you can simply do:
```js
while (true) {
  const chunk = await _reader.readBytes(1000000);
  if (chunk === undefined) {
    break;
  }
  console.log(chunk);
}
```
Great! `readBytes` fits! I just found it worrisome to see it implemented as a simple `read()` in a `while` loop, when I expected a higher-level `slice`-like function (supposedly more efficient?). Anyway, I found performance not to be a problem.
Interesting fact: with `openpgp.enums.compression.zlib` set, the `read()` size drops from 64k down to 513 bytes. Example of 256MB of `1`s (compressed to 354kB):
```
Read 3 bytes
Read 268 bytes
Read 3 bytes
Read 524 bytes
Read 1 bytes
Read 513 bytes (492 times)
Read 1025 bytes
Read 513 bytes (211 times)
Read 280 bytes
counter: 8942 ms
Total bytes read: 362743
```
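A tally like the one above can be produced with a small helper. The sketch below is hypothetical (the stream source is a stand-in for the data stream returned by openpgp.js); it drains a `ReadableStream`, groups consecutive reads of the same size into runs, and prints them in the same "Read N bytes (M times)" format:

```js
// Sketch (hypothetical helper): drain a ReadableStream and tally
// consecutive chunk sizes, producing output like the log above.
async function tallyReads(stream) {
  const reader = stream.getReader();
  const runs = []; // array of { size, count }, preserving run order
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.length;
    const last = runs[runs.length - 1];
    if (last && last.size === value.length) {
      last.count++; // same size as the previous read: extend the run
    } else {
      runs.push({ size: value.length, count: 1 });
    }
  }
  for (const { size, count } of runs) {
    console.log(`Read ${size} bytes` + (count > 1 ? ` (${count} times)` : ''));
  }
  console.log(`Total bytes read: ${total}`);
  return { runs, total };
}
```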
(Using 4 x `readBytes(100000)` brings it down to 7 seconds.)
Thank you!
Output when encrypting an 8MB file:
It seems a size of 64kB + 1 is set by default. How can this chunk size be configured?
(Btw, I'm happy openpgp.js could encrypt 512MB in ~22 seconds using less than 11MB of memory.)