oleg-st / ZstdSharp

Port of the zstd compression library to C#
MIT License

Chunk decompression? #21

Closed: anatoly-kryzhanovsky closed this issue 1 year ago

anatoly-kryzhanovsky commented 1 year ago

Good day! I have a question about using the library.

I have a zstd file uploaded to S3 storage. The file is about 2 GB and I can't download and process it locally, so I open the S3 object as a stream, pass that stream to DecompressionStream, and use a StreamReader over the decompressor. This works perfectly on my machine, but when I run it in the production environment I hit this error: The response ended prematurely, with at least XXXXXXXXXX additional bytes expected.
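
In code, the setup looks roughly like this (bucket and key names are placeholders; this sketch assumes the AWS SDK for .NET):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using ZstdSharp;

class Program
{
    static async Task Main()
    {
        var s3 = new AmazonS3Client();
        // "my-bucket" / "data.zst" are placeholder names.
        using var response = await s3.GetObjectAsync("my-bucket", "data.zst");
        using var decompressor = new DecompressionStream(response.ResponseStream);
        using var reader = new StreamReader(decompressor);

        string? line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            // process each decompressed line as it arrives
        }
    }
}
```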

If I understand correctly, one cause of this is an unstable connection. The only fix I can see is to read the compressed file from S3 in chunks: decompress a chunk, process it, and then move on to the next one.

But I cannot find any way to achieve that. Is it possible? Can you point me to the correct approach for this situation?

oleg-st commented 1 year ago

Hi, I think you can wrap your unstable network stream in your own stream that is tolerant to connection losses.

The scheme looks like: DecompressionStream -> your stream -> S3 network stream

Your stream reads data from the inner network stream and, if the connection is lost, can reconnect and read data from the appropriate offset.

DecompressionStream will read from your stream without ever encountering the problem of disconnection.
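
A minimal sketch of such a wrapper, assuming the AWS SDK for .NET; the class name `ResilientS3Stream` and the retry policy are illustrative, and real code should bound the number of retries:

```csharp
using System;
using System.IO;
using Amazon.S3;
using Amazon.S3.Model;

// Read-only stream that re-opens the S3 object with a ranged GET after a
// connection loss, resuming at the byte offset already consumed.
public sealed class ResilientS3Stream : Stream
{
    private readonly IAmazonS3 _s3;
    private readonly string _bucket;
    private readonly string _key;
    private Stream? _inner;
    private long _position;

    public ResilientS3Stream(IAmazonS3 s3, string bucket, string key)
    {
        _s3 = s3;
        _bucket = bucket;
        _key = key;
    }

    private Stream Inner()
    {
        if (_inner == null)
        {
            var request = new GetObjectRequest
            {
                BucketName = _bucket,
                Key = _key,
                // Open-ended range: resume from the first unread byte.
                ByteRange = new ByteRange($"bytes={_position}-"),
            };
            // Stream.Read is synchronous, so blocking here is fine for a sketch.
            _inner = _s3.GetObjectAsync(request).GetAwaiter().GetResult().ResponseStream;
        }
        return _inner;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        while (true)
        {
            try
            {
                int read = Inner().Read(buffer, offset, count);
                _position += read;
                return read;
            }
            catch (IOException)
            {
                // Connection dropped: discard the broken stream and reopen from
                // the recorded offset. The exact exception type depends on the
                // HTTP stack; real code should cap retries and back off.
                _inner?.Dispose();
                _inner = null;
            }
        }
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position { get => _position; set => throw new NotSupportedException(); }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();

    protected override void Dispose(bool disposing)
    {
        if (disposing) _inner?.Dispose();
        base.Dispose(disposing);
    }
}
```

You can then chain it exactly as described above:

```csharp
using var s3 = new AmazonS3Client();
using var resilient = new ResilientS3Stream(s3, "my-bucket", "data.zst");
using var decompressor = new DecompressionStream(resilient);
using var reader = new StreamReader(decompressor);
```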

anatoly-kryzhanovsky commented 1 year ago

ok, good point, thank you!