Open · otwm opened this issue 5 years ago
I may be wrong, but it seems that AWS Lambda doesn't support Node.js streams, and archiver is built around streams.
https://docs.aws.amazon.com/en_us/lambda/latest/dg/programming-model.html
Here is the working example https://github.com/rokumatsumoto/aws-node-s3-zipper
@timopl did you get this working? I keep getting `dest.on is not a function`. It works locally, but not remotely. I do think Lambda now supports piping streams to the response.
@ricky11 You can use a buffer to collect the stream, and then send the payload encoded in base64.
Simple example:
const Archiver = require('archiver')

const formatResponse = function (body) {
  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/zip',
      'Content-Disposition': 'attachment; filename=myFile.zip',
    },
    isBase64Encoded: true,
    body: body.toString('base64'),
  }
}

exports.handler = async event => {
  return new Promise((resolve, reject) => {
    const zip = Archiver('zip')
    const chunks = []

    // Attach listeners before finalizing so no data is missed,
    // and reject on archiver errors instead of hanging.
    zip.on('data', data => chunks.push(data))
    zip.on('error', reject)
    zip.on('end', () => resolve(formatResponse(Buffer.concat(chunks))))

    zip
      .append('Some text to go in file 1.', { name: '1.txt' })
      .append('Some text to go in file 2. I go in a folder!', {
        name: 'somefolder/2.txt',
      })
      .file('staticFiles/3.txt', { name: '3.txt' })
      .finalize()
  })
}
I think the issue should be closed!
This solution served the purpose. Works like a charm. This should be the pinned answer.
The only problem with this solution is that it isn't scalable or memory-efficient, since the entire archive is held in memory. I tried it with 10,000 images.
For anyone struggling: I found this gist, which is well documented and has solutions for both aws-sdk v2 and v3.
https://gist.github.com/amiantos/16bacc9ed742c91151fcf1a41012445e
I am, however, facing an issue where every child file is also zipped, rather than getting one single compressed file.
This code works well locally. However, it behaves strangely in the Lambda environment: no error occurs, but it does not actually work. The archiver's entry and progress events are never raised. What is the problem?