Open spinlud opened 2 months ago
Appending a Stream instead of a Buffer seems to be a possible solution. Any thoughts?
```js
import { PassThrough } from 'stream';
import archiver from 'archiver';

(async () => {
  const inputStream = new PassThrough();
  const outputStream = new PassThrough(); // Pass this to some upload function

  const archive = archiver('zip', {
    zlib: { level: 9 }
  });

  archive.pipe(outputStream);
  archive.append(inputStream, { name: 'myfile.txt' });

  // Assume this data can't fit in memory all at once, so it should be added incrementally
  for (let i = 0; i < 100; i++) {
    const textData = `data-${i}\n`;
    inputStream.write(textData, 'utf8');
  }
  inputStream.end();

  await archive.finalize();
})();
```
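One detail worth handling in the loop above: `stream.write()` returns `false` when the stream's internal buffer is full, and ignoring that can buffer the whole dataset in memory anyway. Below is a minimal sketch of backpressure-aware incremental writing using only Node built-ins; `writeIncrementally` is a hypothetical helper name, not part of archiver, and the chunk format just mirrors the example above.

```javascript
import { PassThrough } from 'stream';
import { once } from 'events';

// Hypothetical helper: push chunks into a writable stream, pausing
// whenever write() signals backpressure, so memory stays bounded.
async function writeIncrementally(stream, chunkCount) {
  for (let i = 0; i < chunkCount; i++) {
    // write() returns false once the internal buffer is full
    if (!stream.write(`data-${i}\n`, 'utf8')) {
      await once(stream, 'drain'); // resume only after the buffer empties
    }
  }
  stream.end();
}

// Demo with a small highWaterMark so backpressure can actually occur
const input = new PassThrough({ highWaterMark: 64 });
let received = '';
input.on('data', (chunk) => { received += chunk; });

const done = new Promise((resolve) => input.on('end', resolve));
await writeIncrementally(input, 100);
await done;
console.log(received.split('\n').filter(Boolean).length);
```

The same `input` stream could be handed to `archive.append(input, { name: ... })` exactly as in the snippet above, with the producer loop replaced by this backpressure-aware variant.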
I would like to build a single zip file incrementally. What I have tried:

Calling `archive.append(Buffer)` repeatedly with the same file name doesn't seem to work properly: only the first appended data is included, the rest is discarded. Using different file names in the append operations results in multiple files inside the zip, instead of a single file.

Another question: does the data flow from the archive stream to the destination stream only when we call `archive.finalize()`? If that is the case, then I would still be loading all the data into memory. Ideally I would like the data to flow from the archive stream to the destination stream as soon as it is available. Any solution for this?
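For the second question, one way to reason about it is that archiver sits on a chain of Node Transform streams, and a Transform can emit output as soon as input arrives, without waiting for the source to end. The sketch below does not use archiver itself; it is a stand-in pipeline built on Node's built-in gzip Transform, used only to observe that compressed bytes reach the destination stream before the input is ended. Whether archiver behaves the same way is exactly what would need to be confirmed in its docs or source.

```javascript
import { PassThrough } from 'stream';
import { createGzip } from 'zlib';

// Stand-in for the archiver pipeline: input -> gzip Transform -> output.
const input = new PassThrough();
const gzip = createGzip();
const output = new PassThrough();
input.pipe(gzip).pipe(output);

let bytesBeforeEnd = 0;
output.on('data', (chunk) => { bytesBeforeEnd += chunk.length; });

// Write some data but do NOT end the input yet.
input.write('hello world\n');
// Force gzip to emit what it has buffered so far (Z_SYNC_FLUSH).
await new Promise((resolve) => gzip.flush(resolve));
await new Promise((resolve) => setImmediate(resolve));

// Did compressed bytes reach the destination before input.end()?
const flowedEarly = bytesBeforeEnd > 0;

input.end();
await new Promise((resolve) => output.on('end', resolve));
console.log(flowedEarly);
```

If archiver behaves like this (data flowing per-entry as it is appended), then `finalize()` would only mark that no more entries follow, rather than trigger the whole transfer; hooking a byte counter onto `outputStream` before calling `finalize()` in the original example would confirm it.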