101arrowz / fflate

High performance (de)compression in an 8kB package
https://101arrowz.github.io/fflate
MIT License

Website bug: seems to not be compressing anything #47

Closed: feross closed this issue 3 years ago

feross commented 3 years ago

How to reproduce

Open the demo at https://101arrowz.github.io/fflate, select a large (~2.7GB) file, and run the default compression preset.

The problem

The compressed size is unrealistic, which makes me think the file was not processed correctly; instead, an empty string or something similar may have been compressed.

[Screenshot: demo output showing the unrealistic compressed size]

Browser: Latest Chrome
OS: Latest macOS

101arrowz commented 3 years ago

That's odd. I've tried this with multiple massive files and haven't had this problem.

Could you try the "streaming GZIP" preset? It's likely the browser failed to load all 2.7GB of data into memory at once; streaming avoids that problem.
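
For reference, here's roughly what the streaming preset does under the hood (a minimal sketch using fflate's streaming API; the chunk handling is illustrative, not the demo's exact code):

const { AsyncGzip } = fflate;
// AsyncGzip compresses incrementally: each push() emits compressed output
// through ondata, so the whole file never sits in memory at once.
const gz = new AsyncGzip({ level: 6 });
let size = 0;
gz.ondata = (err, chunk, final) => {
  if (err) throw err;
  size += chunk.length; // count output bytes instead of retaining them
  if (final) console.log('compressed size:', size);
};
// Read the File in chunks and push each one; flag the end with a final push.
const reader = file.stream().getReader();
(async () => {
  for (;;) {
    const { done, value } = await reader.read();
    if (done) { gz.push(new Uint8Array(0), true); break; }
    gz.push(value);
  }
})();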

feross commented 3 years ago

The streaming example just hangs:

[Screenshot: streaming demo showing no visible progress]

101arrowz commented 3 years ago

Compressing 2.7GB of data can take around a minute or two, longer on slow computers. I'll run the demo on a large file stream and let you know my findings.

feross commented 3 years ago

Actually, I might have misunderstood the UI -- I'll let it run and report back if it finishes.

feross commented 3 years ago

It looks like the streaming example throws an exception:

Uncaught (in promise) TypeError: Failed to fetch
Promise.then (async)
eval @ VM459:12
Br @ sandbox.ts:144
onClick @ index.tsx:510
j @ preact.module.js:1
101arrowz commented 3 years ago

I found the issue: the demo tries to concatenate the output chunks into a single ArrayBuffer, which throws because the result is over 2GB, the maximum in browsers. I'll fix it by calculating only the length rather than concatenating.
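
In other words, the demo's collector did something like the first pattern below, and the fix switches to the second (a hypothetical sketch of the pattern, not the demo's actual source; stream stands in for the demo's fflate stream):

// Buggy pattern: concatenate every output chunk into one Uint8Array.
// Allocating a single buffer over ~2GB throws in browsers.
const chunks = [];
let total = 0;
stream.ondata = (err, chunk, final) => {
  chunks.push(chunk);
  total += chunk.length;
  if (final) {
    const out = new Uint8Array(total); // fails once total exceeds the limit
    let offset = 0;
    for (const c of chunks) { out.set(c, offset); offset += c.length; }
  }
};

// Fixed pattern: accumulate only the length; never build the full buffer.
let size = 0;
stream.ondata = (err, chunk, final) => {
  size += chunk.length;
  if (final) console.log('compressed size:', size);
};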

feross commented 3 years ago

Makes sense, thanks for the quick debugging!

101arrowz commented 3 years ago

Try pasting this into the code box:

const { AsyncGzip } = fflate;
// Theoretically, you could do this for every file, but I haven't done that here
// for the sake of simplicity.
const file = files[0];
const gzipStream = new AsyncGzip({ level: 6 });
// Stream the file through GZIP to keep memory usage low.
// toNativeStream and callback are helpers provided by the demo sandbox.
const gz = file.stream().pipeThrough(toNativeStream(gzipStream));
let sz = 0;
gz.pipeTo(new WritableStream({
  write(dat) { sz += dat.length; console.log(sz); },
  close() { callback('Length: ' + sz); }
}));

It took a while but worked for me.

feross commented 3 years ago

Perfect, that worked!