Closed — XSven closed this issue 9 months ago
Hey @XSven, thanks for the report. It sounds like there is something unusual about the file you are uncompressing. Is it available for me to test?
Can you see what gunzip thinks about the file by running this:
gunzip -t ~/tmp/file.gz
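For reference, a gzip file made of concatenated members passes gunzip's integrity test and decompresses fully at the command line; a quick demo (the file name multi.gz is just a throwaway example, not the reporter's file):

```shell
# Build a two-member (concatenated) gzip file.
printf 'a' | gzip  > multi.gz    # first member
printf 'b' | gzip >> multi.gz    # second member, appended

gunzip -t multi.gz               # integrity test passes (exit status 0)
gunzip -c multi.gz               # prints "ab": gunzip reads every member
```

This is why the integrity test and a manual gunzip can both succeed even when a single-stream reader stops early.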
One other thing to try is to add the option MultiStream => 1 when creating the gunzip object:
my $z = IO::Uncompress::Gunzip->new( $input, Append => 1, MultiStream => 1 );
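What MultiStream => 1 changes can be illustrated outside Perl: Python's zlib and gzip modules show the same split between a single-stream and a multi-stream reader (an analogy only, not IO::Uncompress::Gunzip internals):

```python
import gzip
import zlib

# Two concatenated gzip members: the layout that trips up a
# single-stream decompressor.
blob = gzip.compress(b"first ") + gzip.compress(b"second")

# A single-stream reader (zlib with wbits=31 for gzip framing) stops
# at the end of the first member; the rest lands in unused_data.
d = zlib.decompressobj(wbits=31)
print(d.decompress(blob))        # b'first '
print(len(d.unused_data) > 0)    # True: a second member was left unread

# A multi-stream reader keeps going until the input is exhausted,
# which is what MultiStream => 1 asks the gunzip object to do.
print(gzip.decompress(blob))     # b'first second'
```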
I have checked the integrity first and the file is proper. I was also able to uncompress it manually using gzip. The MultiStream => 1 option has solved my problem (thanks)! I have found more information here: Dealing with concatenated gzip files
From my perspective, a warning should be raised if IO::Uncompress::Gunzip detects multiple data streams while the MultiStream option is off. Could this be implemented?
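The requested behaviour could be prototyped along these lines; this is a Python sketch of the idea only, not IO::Uncompress code, and the function name and padding handling are assumptions:

```python
import warnings
import zlib

def gunzip_first_stream(data: bytes) -> bytes:
    """Sketch: decompress only the first gzip member, but warn when
    further members follow (the behaviour requested in this issue)."""
    d = zlib.decompressobj(wbits=31)   # 31 = gzip framing
    out = d.decompress(data)
    # Bytes left after the first member mean another stream follows;
    # trailing zero padding is ignored here (an assumption).
    if d.unused_data.strip(b"\x00"):
        warnings.warn("input contains additional gzip streams; "
                      "enable multi-stream reading to get them all")
    return out
```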
> I have checked the integrity first and the file is proper. I was also able to uncompress it manually using gzip. The MultiStream => 1 option has solved my problem (Thx)! I have found more information here: Dealing with concatenated gzip files
Excellent!
> From my perspective a warning should be raised if IO::Uncompress::Gunzip detects multiple data streams and the MultiStream option is off. Could this be implemented?
Need to research that a bit more to understand the implications. Added to my TODO list.
Closing - issue added to TODO list.
I have a big gz file. I am processing the file with the t/bigfile.t test script. The output is this:
My expectation is that at least one of the two assertions should fail, because the unzip action stops prematurely after 64 MB without raising a warning or an exception.
I have used PerlIO::gzip as an alternative method to uncompress, and the phenomenon is the same: reading and uncompressing the data stops after 64 MB. I do not understand why!
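One way to confirm the multi-stream diagnosis is to count the gzip members in the file: more than one member explains a decompressor that stops early after reading only the first stream. A Python sketch (the helper name is hypothetical, and it reads the whole file into memory for brevity):

```python
import zlib

def count_gzip_members(path: str) -> int:
    """Diagnostic sketch: count the concatenated gzip members in a file."""
    with open(path, "rb") as fh:
        data = fh.read()
    members = 0
    data = data.lstrip(b"\x00")               # tolerate zero padding
    while data:
        d = zlib.decompressobj(wbits=31)      # 31 = gzip framing
        d.decompress(data)                    # consume one member
        if not d.eof:                         # truncated or corrupt member
            break
        members += 1
        data = d.unused_data.lstrip(b"\x00")  # move on to the next member
    return members
```

A result greater than 1 for the big gz file would mean the 64 MB stop is simply the end of the first member.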