Closed: loretoparisi closed this 1 year ago
I'm facing a similar issue, except that my file is twice the size it's supposed to be.
I was originally referring to this to extract my file.
Looking at your code again, I realized I was using a string to accumulate the data. I switched to using an array of buffers, and it worked for me:
```js
const fs = require("fs")
const zlib = require("zlib")
const tar = require("tar-stream")

// fileToExtract, getFileName, localFilePath, and tarFilePath are defined
// elsewhere in my code.
const extract = tar.extract()
const buffers = []
extract.on("entry", function (header, stream, cb) {
  stream.on("data", function (chunk) {
    if (header.type === "file" &&
        (!fileToExtract || getFileName(header.name) === fileToExtract)) {
      // Collect raw Buffer chunks; never concatenate them as strings.
      buffers.push(chunk)
    }
  })
  stream.on("end", cb)
  stream.on("error", cb)
  stream.resume()
})
extract.on("finish", function () {
  fs.writeFile(localFilePath, Buffer.concat(buffers), () => undefined)
})
// Note: .pipe() returns a stream, not a promise, so there is nothing to await.
return fs.createReadStream(tarFilePath)
  .pipe(zlib.createGunzip())
  .pipe(extract)
```
My file is only around 6MB though.
I have to extract a `.tar.gz` archive. My solution was to pipe a `tar-stream` extract to `gunzip` in this way, but the resulting file size is less than 200KB, while the source file was 900MB (3GB when extracted).