Closed by kevva 11 years ago
:+1: Yep, to keep memory usage low. @wibblymat are the internals prepared for this? Related: #2
:+1:
No. You cannot correctly read a zip file from start to end in a stream. The zip index (the central directory) lives at the *end* of the file, so you have to be able to do random access and to search through the file backwards. Node's streams are not rewindable, so you can't do that. http://commons.apache.org/proper/commons-compress/zip.html explains the limitations of Java's stream-based zip reader. "External attributes" means things like permissions, owner, etc.
I suspect that some of node-unzip's problems arose because it tried to process the file as a stream, which meant making guesses about the structure that wouldn't always be right.
I do create streams internally for the actual file contents, but all of the data structures are held in memory.
Tar files can be streamed no problem.
WONTFIX. Sorry.
Ouch. I guess you'd have to settle for something like this (in case you're fetching an archive or likewise):

```js
.pipe(fs.createWriteStream('foo.zip'))
  .on('close', function () {
    decompress.extract('foo.zip', { path: 'bar' });
  });
```