rust-lang / flate2-rs

DEFLATE, gzip, and zlib bindings for Rust
https://docs.rs/flate2
Apache License 2.0
891 stars 158 forks

write::GzEncoder doesn't encode a huge buffer #271

Closed ababo closed 1 year ago

ababo commented 3 years ago

When I pass a 6 MiB buffer, it produces an obviously incomplete 60 KiB file. It only works when I feed it smaller chunks (producing the expected 2 MiB file).

MichaelMcDonnell commented 1 year ago

Hi @ababo, I cannot reproduce your issue. I wrote an example that uses GzEncoder to compress a 10 MB buffer and it works fine on my machine (AMD Ryzen 5 3600X, Windows 10, flate2 version 1.0.24 with the default features):

use std::{
    fs::File,
    io::{self, Write},
    iter::repeat_with,
};

use flate2::{write::GzEncoder, Compression};

fn main() -> Result<(), io::Error> {
    // Create test data with ten million 42's.
    //let buf = vec![42; 10_000_000];
    // Create test data with ten million random ASCII characters
    // (uses the external `fastrand` crate).
    let rng = fastrand::Rng::with_seed(42);
    let buf: Vec<u8> = repeat_with(|| rng.u8(32..126)).take(10_000_000).collect();

    let f = File::create("huge_buffer.txt.gz")?;
    let mut gz = GzEncoder::new(f, Compression::default());
    gz.write_all(&buf)?;
    gz.finish()?;
    Ok(())
}

Does that work on your machine?

Could you tell us more about your machine, operating system, version of flate2 and how you are using flate2?

emilioparker commented 1 year ago

I have seen something similar when I mistakenly used write instead of write_all. It is hard to spot because write behaves just like write_all for small chunks of data, and write has an easy-to-ignore return value...
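To make the pitfall concrete, here is a minimal std-only sketch. It uses a hypothetical `ShortWriter` (my own stand-in, not part of flate2) that accepts at most 4 bytes per call, mimicking a sink whose internal buffer fills up: a single `write` may consume only part of the input and report that in its return value, while `write_all` loops until everything is accepted.

```rust
use std::io::{self, Write};

// Hypothetical writer that accepts at most 4 bytes per call,
// standing in for an encoder whose internal buffer can fill up.
struct ShortWriter {
    data: Vec<u8>,
}

impl Write for ShortWriter {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        let n = buf.len().min(4);
        self.data.extend_from_slice(&buf[..n]);
        Ok(n) // a single `write` may consume only part of `buf`
    }
    fn flush(&mut self) -> io::Result<()> {
        Ok(())
    }
}

// Bytes actually stored after one call to `write`.
fn short_write(payload: &[u8]) -> usize {
    let mut w = ShortWriter { data: Vec::new() };
    w.write(payload).unwrap();
    w.data.len()
}

// Bytes actually stored after `write_all`, which retries until done.
fn full_write(payload: &[u8]) -> usize {
    let mut w = ShortWriter { data: Vec::new() };
    w.write_all(payload).unwrap();
    w.data.len()
}

fn main() {
    let payload = b"0123456789";
    println!("write stored {} of {} bytes", short_write(payload), payload.len());
    println!("write_all stored {} bytes", full_write(payload));
}
```

With a 10-byte payload, `write` stores only 4 bytes and silently drops nothing (the return value says how much was taken, but it is easy to ignore), whereas `write_all` stores all 10. Ignoring `write`'s return value is exactly how you end up with a truncated output file for large buffers.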

JohnTitor commented 1 year ago

Closing as no one could reproduce your issue and there's no follow-up comment from the author. Feel free to re-open with information to reproduce if you still have an issue. Thanks!