archiverjs / node-archiver

a streaming interface for archive generation
https://www.archiverjs.com

ZIP consumes all memory and the Lambda function response returns null #764

Open hardikbhalgama opened 1 month ago

hardikbhalgama commented 1 month ago

I'm facing a problem in two cases.

  1. Zipping 1000+ files in a single execution.
  2. Zipping 10+ files when the Lambda is invoked multiple times to create zips.

In both cases the Lambda function response is null, without any error, and it keeps returning null until I redeploy the Lambda function.

Below is my code.

```js
const fileKeys = await this.listFileKeys();
const archive = archiver("zip");
const passThroughStream = new stream.PassThrough();
const { uploadId } = await createMultipartUpload(this.bucketName, this.destinationKey, this.aclType);

// Append a download stream for each S3 object to the archive
for (const file of fileKeys) {
  const downloadStream = await createDownloadStream(this.bucketName, file);
  const name = `${file.split(this.sourcePath + "/").pop()}`;
  archive.append(downloadStream, { name });
}

archive.pipe(passThroughStream);
archive.finalize();

// Collect the whole archive into a single Buffer, then upload it in 5 MB parts
const buffer = await streamToBuffer(passThroughStream);
const chunkSize = 5 * 1024 * 1024; // 5 MB chunks
const parts = [];

for (let i = 0; i < Math.ceil(buffer.length / chunkSize); i++) {
  const start = i * chunkSize;
  const end = Math.min((i + 1) * chunkSize, buffer.length);
  const chunk = Buffer.from(buffer.subarray(start, end));
  const { ETag } = await uploadPart(this.bucketName, this.destinationKey, uploadId, chunk, i + 1);
  parts.push({ ETag, PartNumber: i + 1 });
}

await completeMultipartUpload(this.bucketName, this.destinationKey, uploadId, parts);
logger.logDebug(`Completed multipart upload to S3. Destination key: ${this.destinationKey}`);

return { status: true, errorMessage: "", destination_key: this.destinationKey };
```

Any help is appreciated.
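
For reference, `streamToBuffer` is not shown in the issue; a typical implementation (an assumption, not the actual helper) collects every chunk emitted by the stream into one Buffer, so the complete ZIP is held in memory before any part is uploaded:

```js
// Assumed sketch of streamToBuffer: concatenates every chunk from the
// readable stream into a single in-memory Buffer.
function streamToBuffer(readable) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readable.on("data", (chunk) => chunks.push(chunk));
    readable.on("error", reject);
    readable.on("end", () => resolve(Buffer.concat(chunks)));
  });
}
```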
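
Likewise, `createDownloadStream`, `createMultipartUpload`, `uploadPart`, and `completeMultipartUpload` are not shown; assuming the AWS SDK v3 is in use, they would be thin wrappers roughly along these lines (a sketch, the real helpers may differ):

```js
const {
  S3Client,
  GetObjectCommand,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
} = require("@aws-sdk/client-s3");

const s3 = new S3Client({});

// Assumed wrapper: returns a readable stream for an object's body.
async function createDownloadStream(bucket, key) {
  const { Body } = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
  return Body;
}

// Assumed wrapper: starts a multipart upload and returns its UploadId.
async function createMultipartUpload(bucket, key, acl) {
  const { UploadId } = await s3.send(
    new CreateMultipartUploadCommand({ Bucket: bucket, Key: key, ACL: acl })
  );
  return { uploadId: UploadId };
}

// Assumed wrapper: uploads one part and returns its ETag.
async function uploadPart(bucket, key, uploadId, body, partNumber) {
  const { ETag } = await s3.send(
    new UploadPartCommand({
      Bucket: bucket,
      Key: key,
      UploadId: uploadId,
      PartNumber: partNumber,
      Body: body,
    })
  );
  return { ETag };
}

// Assumed wrapper: completes the multipart upload from the collected parts.
async function completeMultipartUpload(bucket, key, uploadId, parts) {
  return s3.send(
    new CompleteMultipartUploadCommand({
      Bucket: bucket,
      Key: key,
      UploadId: uploadId,
      MultipartUpload: { Parts: parts },
    })
  );
}
```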