EvanOxfeld / node-unzip

node.js cross-platform unzip using streams
MIT License
614 stars 343 forks

Unhandled error: "invalid distance too far back" #60

Open callumlocke opened 10 years ago

callumlocke commented 10 years ago

When trying to unzip a large archive (about 45MB that should expand to about 100MB), it fails with this output:

events.js:72
        throw er; // Unhandled 'error' event
              ^
Error: invalid distance too far back
    at Zlib._binding.onerror (zlib.js:295:17)

Any idea what this means?

molant commented 9 years ago

Having the same error with a 15MB zip file. Did you find a workaround for this?

alarner commented 9 years ago

I'm also having the same issue. Did either of you folks figure out what the problem was? My zip file is about 20mb.

molant commented 9 years ago

I ended up using decompress to unzip my files :(

richdunajewski commented 9 years ago

Experiencing the same thing with a 2.7 MB file. I was pulling the file off a server programmatically, and periodically they update the zip (presumably the same process each time), and now it doesn't work. I can open it with Windows just fine, however.

kevinohara80 commented 9 years ago

Same. Bummer.

theartofme commented 9 years ago

I also started having this issue. The problem started when I deployed to an AWS server - the same zip files uncompress without error during local testing.

It must have something to do with specific Node configurations.

kevinohara80 commented 9 years ago

This project is abandonware so I doubt this will be resolved. I'm using Adm-Zip now and it works well. decompress seems like a decent option too.

remram44 commented 8 years ago

Bumping into this too.

MaximilianBuegler commented 7 years ago

The problem isn't in the unzip library. You were probably using the createReadStream method of the S3 getObject response. This seems to pack an HTTP header in front of the zip file, which causes unzip to rightfully complain, with the aforementioned error, that this is not a valid zip file.

The solution is to take the Body of the response, which is a Buffer, convert that to a readable stream, and pipe it into unzip. The BufferStream converter class can be found here: https://gist.github.com/bennadel/b35f3a15cb3b03ddbcf8#file-test-js . (No idea why this isn't available as an npm package.)

So my solution then looks as follows (loose code snippet):

var s3 = new AWS.S3();
var params = { Bucket: s3bucket, Key: zipFileName };
// Runs inside a Promise executor, so reject is in scope.
s3.getObject(params, function (err, queryData) {
    if (err) {
        return reject("Loading file " + s3bucket + " : " + zipFileName + " failed. " + err);
    }
    var stream = new BufferStream(queryData.Body);
    var unzipStream = stream.pipe(unzip.Parse());
    unzipStream.on('error', function (err) {
        return reject("Unzipping file " + s3bucket + " : " + zipFileName + " failed. " + err);
    });
    unzipStream.on('entry', function (entry) {
        ...
    });
});
Hope this helps others with this issue.

mtharrison commented 7 years ago

@MaximilianBuegler can you explain what you mean by

This seems to pack an HTTP header in front of the zip file, which causes unzip to rightfully complain about this not being a zip file with the aforementioned error.

Are you saying that when you read the S3 getObject stream, it first writes an HTTP header before the actual body of the object? Surely this can't be true. The library would be completely broken if this were the case.

coockoo commented 7 years ago

For anyone who is looking for an answer: I accidentally bumped into this repo, a maintained fork of this one: https://github.com/ZJONSSON/node-unzipper. I just replaced my dependency and require line, and everything worked like a charm.
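For anyone migrating: unzipper keeps unzip's streaming interface, so the switch is usually just the require line. A sketch (assuming the unzipper package is installed; 'archive.zip' and the entry path are placeholders):

```javascript
const fs = require('fs');
const unzipper = require('unzipper'); // was: require('unzip')

fs.createReadStream('archive.zip')
  .pipe(unzipper.Parse())
  .on('entry', function (entry) {
    // entry.path and entry.type ('File' or 'Directory') work as in unzip.
    if (entry.path === 'file-i-want.txt') {
      entry.pipe(fs.createWriteStream('file-i-want.txt'));
    } else {
      entry.autodrain(); // important: drain entries you skip
    }
  });
```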