Closed: feross closed this issue 3 years ago.
That's odd. I've tried this with multiple massive files and haven't had this problem.
Could you try the "streaming GZIP" preset? It's likely that the browser failed to load the 2.7GB of data into memory. Streaming avoids this problem.
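If it helps, here is roughly what the streaming approach does with fflate's stream API directly. This is a rough sketch, not the demo's exact code: it assumes file is a File object from a file input and that the loop runs inside an async function.

const { AsyncGzip } = fflate;
// Streaming GZIP: chunks are compressed as they arrive, so the whole
// 2.7GB never has to sit in memory at once.
const gz = new AsyncGzip({ level: 6 });
let total = 0;
gz.ondata = (err, chunk, final) => {
  if (err) throw err;
  total += chunk.length; // count the output instead of buffering it
  if (final) console.log('compressed size:', total);
};
const reader = file.stream().getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) {
    gz.push(new Uint8Array(0), true); // an empty final chunk ends the stream
    break;
  }
  gz.push(value);
}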
The streaming example just hangs:
Compressing 2.7GB of data can take around a minute or two, or longer on slow computers. I'll run the demo on a large file stream and let you know my findings.
Actually, I might have misunderstood the UI -- I'll let it run and report back if it finishes.
It looks like the streaming example throws an exception:
Uncaught (in promise) TypeError: Failed to fetch
Promise.then (async)
eval @ VM459:12
Br @ sandbox.ts:144
onClick @ index.tsx:510
j @ preact.module.js:1
I found the issue: I'm trying to concatenate the output buffer into an ArrayBuffer, which throws because it's over 2GB, the maximum in browsers. I'll fix it by only calculating the length rather than concatenating.
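For context, the failing pattern looked roughly like this. This is a rough sketch rather than the exact demo code; chunks and len are made-up names here.

// Buffering every output chunk and then joining them into a single
// Uint8Array fails once the total passes the ~2GB ArrayBuffer limit.
const chunks = [];
let len = 0;
gzipStream.ondata = (err, chunk, final) => {
  if (err) throw err;
  chunks.push(chunk);
  len += chunk.length;
  if (final) {
    const out = new Uint8Array(len); // throws for an allocation over ~2GB
    let off = 0;
    for (const c of chunks) { out.set(c, off); off += c.length; }
    callback('Length: ' + out.length);
  }
};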
Makes sense, thanks for the quick debugging!
Try pasting this into the code box:
const { AsyncGzip } = fflate;
// Theoretically, you could do this on every file, but I haven't done that here
// for the sake of simplicity.
const file = files[0];
const gzipStream = new AsyncGzip({ level: 6 });
// We can stream the file through GZIP to reduce memory usage
const gz = file.stream().pipeThrough(toNativeStream(gzipStream));
// Only the total length is tracked; concatenating the output chunks into
// one buffer would exceed the browser's ~2GB ArrayBuffer limit.
let sz = 0;
gz.pipeTo(new WritableStream({
  write(dat) { sz += dat.length; console.log(sz); },
  close() { callback('Length: ' + sz); }
}));
It took a while but worked for me.
Perfect, that worked!
How to reproduce
The problem
The compressed size is unrealistic, which makes me think that the file was not processed correctly; instead, an empty string or something similar was processed.
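As a rough sanity check (a sketch, assuming fflate is already loaded on the page): gzipping empty input produces only around 20 bytes of header and trailer, so a compressed size in that ballpark would mean the file contents never reached the compressor.

// GZIP of zero bytes is only ~20 bytes, so an output near that size
// indicates an empty input rather than the real 2.7GB file.
console.log(fflate.gzipSync(new Uint8Array(0)).length);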
Browser: Latest Chrome
OS: Latest macOS