This is intended behavior. fflate does not use a temporary buffer during compression, so an input chunk always maps to an output chunk in the DEFLATE stream. In other words, each push() carries a 5-byte overhead. Moreover, it's nearly impossible to compress a 16-byte chunk effectively, and 5 bytes of overhead per 16 bytes of payload is already over 30%, so the output can easily end up larger than the input.
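To make that concrete, here is a rough, self-contained sketch comparing one large push against many 16-byte pushes. The 1 MiB zero-filled buffer and the ESM import are illustrative assumptions, not data or code from this issue:

    import * as fflate from 'fflate';

    // Highly compressible test data, roughly 1 MiB (illustrative size only).
    const data = new Uint8Array(1 << 20);

    // One big push: the whole buffer is compressed as a single unit.
    let oneShot = 0;
    const big = new fflate.Deflate();
    big.ondata = (chunk) => { oneShot += chunk.length; };
    big.push(data, true);

    // Many 16-byte pushes: every push becomes its own output chunk,
    // each paying the per-push overhead described above.
    let tiny = 0;
    const small = new fflate.Deflate();
    small.ondata = (chunk) => { tiny += chunk.length; };
    for (let i = 0; i < data.length; i += 16) {
      small.push(data.subarray(i, i + 16), i + 16 >= data.length);
    }

    console.log(oneShot, tiny); // tiny should come out far larger than oneShot

On this kind of input the single push compresses to a few kilobytes, while the 16-byte pushes leave the output close to (or above) the input size.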
The solution is to store an array of Uint8Array chunks, then concatenate and push when you reach a reasonable block size (say 64 kB). In this test case, just use chunks larger than 16 bytes. Hope this makes sense! Let me know if you have any other questions or if I can close the issue.
OK, I understand. No other questions.
Sorry, but I initially thought this could be a bug. In my case, the chunk size is not controllable; we can't ask our customers to change their chunk size to suit the product. 😂
Maybe you could create a wrapper class and serve this to your client instead?
class Deflate extends fflate.Deflate {
  _chunks = [];
  _chunkSize = 0;
  push(data, final) {
    this._chunks.push(data);
    const newChunkSize = this._chunkSize + data.length;
    if (newChunkSize > 16384 || final) {
      let buf = data;
      if (this._chunkSize) {
        buf = new Uint8Array(newChunkSize);
        let offset = 0;
        for (const chunk of this._chunks) {
          buf.set(chunk, offset);
          offset += chunk.length;
        }
      }
      super.push(buf, final);
      this._chunks = [];
      this._chunkSize = 0;
    }
  }
}
This could be the resolution. With the same file, the console result is now stream Deflate length 928998, still about 30% bigger than the normal size, but it is as fast as the normal path.
Sorry, I made a mistake in the code above. This is the code you should use for maximum performance and a good compression ratio.
class Deflate extends fflate.Deflate {
  _chunks = [];
  _chunkSize = 0;
  push(data, final) {
    this._chunks.push(data);
    const newChunkSize = this._chunkSize + data.length;
    if (newChunkSize > 262143 || final) {
      let buf = data;
      if (this._chunkSize) {
        buf = new Uint8Array(newChunkSize);
        let offset = 0;
        for (const chunk of this._chunks) {
          buf.set(chunk, offset);
          offset += chunk.length;
        }
      }
      super.push(buf, final);
      this._chunks = [];
      this._chunkSize = 0;
    } else this._chunkSize = newChunkSize;
  }
}
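For anyone finding this later, here is a minimal usage sketch of the wrapper, assuming the class above is in scope and that the 1 MiB placeholder buffer stands in for the client's real data arriving in 16-byte chunks:

    import * as fflate from 'fflate';

    // Placeholder input; in the real case this arrives as uncontrollable 16-byte chunks.
    const input = new Uint8Array(1 << 20);

    const def = new Deflate(); // the wrapper class defined above
    let total = 0;
    def.ondata = (chunk, final) => {
      total += chunk.length;
      if (final) console.log('deflated length:', total);
    };

    // Feed the data in 16-byte slices, exactly as the client would.
    for (let i = 0; i < input.length; i += 16) {
      def.push(input.subarray(i, i + 16), i + 16 >= input.length);
    }

The buffering happens inside the wrapper, so the caller keeps pushing its small chunks unchanged while fflate only ever sees blocks of roughly 256 kB.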
@xqdoo00o I'm going to close this for now, since the above code fixes your problem (let me know if it doesn't, it works on my end).
How to reproduce
The problem: as I tested, the console shows that the fflate deflate stream result is too big, even bigger than the original data length of 7832702. So I wonder whether the Deflate stream's push function has a logic error.