Closed BlueHotDog closed 3 years ago
If there's a memory leak, I would think it's because you are holding on to references to the stream outputs for too long. After processing a chunk, the async streams drop all references to it, as far as I can tell, but I'll investigate as well.
Also, the second error looks more like an attempt to reuse a Uint8Array after pushing it to AsyncGzip. The docs mention that all pushed chunks are consumed in async streams (they become inaccessible after pushing), so if you try to read from data you've already pushed, you may get this error. If it's actually coming from fflate, please send the full stack trace.
Unfortunately the error comes from an extension and is minified, so it's hard to produce a readable trace. I'm also still struggling to reproduce it locally, but let me try to give some more context that might help:
Basically, the function we have is: try gzipping; if it succeeds, return the gzipped buffer. If any error occurs (try/catch), return the original buffer.
I suspect a memory leak because when I compare memory snapshots between two runs, a lot of retained buffers are still present.
Still looking into this issue; it may only apply to a specific set of environments. What browser/engine are you using?
Hey, we're still investigating; it might be an issue on our end. Closing for now; I'll re-open with more information if applicable. Thank you so much for the quick response, and sorry for the bother.
Uploading multiple files from a worker creates a memory leak and eventually throws the following error: "Cannot perform Construct on a detached ArrayBuffer". Comparing memory snapshots between two runs, it seems the created array buffers are not being cleaned up after running the compress method.