archiverjs / node-archiver

a streaming interface for archive generation
https://www.archiverjs.com
MIT License

Create a single zip file incrementally? #760

Open spinlud opened 2 months ago

spinlud commented 2 months ago

I would like to build a single zip file incrementally. What I have tried:

import {PassThrough} from 'stream';
import archiver from 'archiver';

(async () => {
    const stream = new PassThrough(); // Pass this to some upload function
    const archive = archiver('zip', {
        zlib: { level: 9 }
    });

    archive.pipe(stream);

    // Assume this data can't fit in memory all at once, so it should be added incrementally
    for (let i = 0; i < 100; i++) {
        const textData = `data-${i}`;
        // The problem is here: the data is not appended to the same file. Using a
        // different name per chunk results in multiple files inside the archive
        // instead of a single one.
        archive.append(Buffer.from(`${textData}\n`, 'utf8'), { name: `myfile.txt` });
    }

    await archive.finalize();
    await upload; // assume `upload` is the promise returned by the function consuming `stream`
})();

Using archive.append(Buffer) repeatedly with the same file name doesn't seem to work properly: only the first appended buffer is included, and the rest is discarded. Using a different file name for each append results in multiple files inside the archive instead of a single one.

Another question: does the data flow from the archive stream to the destination stream only when archive.finalize() is called? If so, I would still be loading all the data in memory. Ideally the data would flow from the archive stream to the destination stream as soon as it is available.

Any solution for this?

spinlud commented 2 months ago

Appending a Stream instead of a Buffer seems to be a possible solution. Any thoughts?

import {PassThrough} from 'stream';
import archiver from 'archiver';

(async () => {
    const inputStream = new PassThrough();
    const outputStream = new PassThrough(); // Pass this to some upload function
    const archive = archiver('zip', {
        zlib: { level: 9 }
    });

    archive.pipe(outputStream);
    archive.append(inputStream, { name: `myfile.txt` });

    // Assume this data can't fit in memory all at once, so it should be added incrementally
    for (let i = 0; i < 100; i++) {
        const textData = `data-${i}\n`;
        inputStream.write(textData, 'utf8');
    }

    inputStream.end();
    await archive.finalize();
})();