mafintosh / tar-stream

tar-stream is a streaming tar parser and generator.
MIT License

Adding multiple large entries? #99

Closed gregory-h closed 1 year ago

gregory-h commented 5 years ago

I'm hoping to use this to stream a tarball backup that would be too large to fit in memory.

My algorithm:

- in an async repeat loop
- at completion of the async loop

The problem is that on the second iteration of the async loop, `pack.entry(..)` returns null, because `this._destroyed` is set at tar-stream/pack.js:110.

I've seen the issue about the lack of support for multiple concurrent entries, which is why I structured my algorithm to wait for one entry to complete before starting the next. That still doesn't seem to work, so I'm wondering whether what I'm trying to do is achievable.

Thx,

abrinckm commented 4 years ago

@gregory-h try the solution here: https://github.com/mafintosh/tar-stream/issues/24

bentorkington commented 4 years ago

I have implemented this with the `async` library, using `eachOfSeries` so that entries are piped strictly one at a time. Here's a simplified example:

```javascript
async.eachOfSeries(files, (file, key, next) => {
    // pipe each file one at a time; the entry callback advances the loop
    fs.createReadStream(file).pipe(pack.entry({ name: file }, (err) => next(err)));
}, (err) => {
    // called once every iteration is done. todo: handle error here
    pack.finalize();
});
```