aspecto-io / sns-sqs-big-payload

Amazon SNS/SQS client library that enables sending and receiving messages with payload larger than 256KiB via Amazon S3.
Apache License 2.0

SqsConsumer with Lambda didn't work #17

Open lesimoes opened 3 years ago

lesimoes commented 3 years ago

Hello, I tried to use the package with my Lambda functions. The SqsProducer works fine: it sends the JSON payload to an S3 bucket. My problem is consuming a message whose payload was sent to S3. I tried to follow this example: https://github.com/aspecto-io/sns-sqs-big-payload/blob/HEAD/docs/usage-in-lambda.md

But the handleMessage function never runs, and the response is always undefined. transformMessageBody does run, but its body parameter is undefined as well.

My getMessage function should return the message payload.

  async getMessage (messageProps) {
    const sqsConsumer = SqsConsumer.create({
      region: 'us-east-1',
      // Fetch the real payload from S3 when the body only carries a pointer.
      getPayloadFromS3: true,
      s3Bucket: 'MY-S3-BUCKET',
      // Unwrap the SNS envelope before the payload is resolved.
      transformMessageBody: (body) => {
        console.log('transformMessageBody', body); // body arrives as undefined
        const snsMessage = JSON.parse(body);
        return snsMessage.Message;
      },
      parsePayload: (raw) => JSON.parse(raw),
      handleMessage: async ({ payload }) => {
        console.log('handleMessage', payload); // never reached
      },
    });

    const result = await sqsConsumer.processMessage(messageProps);
    return result;
  }

This is my entire message payload (the Lambda event):

{
    "Records": [
        {
            "messageId": "238bba99-76e3-4fe2-b777-****",
            "receiptHandle": "BLABLABAL",
            "body": "{\"S3Payload\":{\"Id\":\"a51bed3c-e0d2-45f1-a37a-***\",\"Bucket\":\MY-S3-BUCKET\",\"Key\":\"a51bed3c-e0d2-45f1-a37a-*****.json\",\"Location\":\"https://MY-S3-BUCKET.s3.amazonaws.com/a51bed3c-e0d2-45f1-a37a-****.json\"}}",
            "attributes": {
                "ApproximateReceiveCount": "8",
                "SentTimestamp": "1604921763468",
                "SenderId": "AROA4VG6N6C5XCF3YA7S3:BABALBALBAL",
                "ApproximateFirstReceiveTimestamp": "1604921763468"
            },
            "messageAttributes": {},
            "md5OfBody": "4a125225a9e7016440****",
            "eventSource": "aws:sqs",
            "eventSourceARN": "arn:aws:sqs:us-east-1:*****:queueName",
            "awsRegion": "us-east-1"
        }
    ]
}
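
Decoded, the escaped body above is just the S3 pointer that the producer stored (values redacted as in the original):

{
    "S3Payload": {
        "Id": "a51bed3c-e0d2-45f1-a37a-***",
        "Bucket": "MY-S3-BUCKET",
        "Key": "a51bed3c-e0d2-45f1-a37a-*****.json",
        "Location": "https://MY-S3-BUCKET.s3.amazonaws.com/a51bed3c-e0d2-45f1-a37a-****.json"
    }
}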

I pass the first object of the Records array to it, like this:

eventBody = await this.queueService.getMessage(event.Records[0]);
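
For context, the surrounding Lambda entry point looks roughly like this; QueueService is only a sketch of how I wrap the getMessage method above, not part of the library:

// Sketch of the Lambda handler wiring (names are illustrative).
const queueService = new QueueService(); // exposes the getMessage method shown above

exports.handler = async (event) => {
  // Lambda delivers SQS messages in event.Records; each record goes to getMessage.
  for (const record of event.Records) {
    const eventBody = await queueService.getMessage(record);
    console.log('eventBody', eventBody); // currently always undefined
  }
};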

I really believe I'm missing something, but I don't know where. Could you help me?

Thanks.

mzahor commented 3 years ago

Yes, Lambda mode is broken at the moment; I've linked an issue with the details.

lesimoes commented 3 years ago

@mzahor, I recently used the idea you shared in that post to build my own solution. The main idea is all yours, and it works very well with Lambda.

Thank you for your post.

P.S.: If you want, I can share my solution with you; just let me know.
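
Roughly, the shape of a Lambda-side workaround is to resolve the S3 pointer directly in the handler instead of going through the consumer. A minimal sketch, assuming the AWS SDK v2 and the S3Payload format shown in the message body above (this is not the exact code of my package):

const AWS = require('aws-sdk');

const s3 = new AWS.S3();

// Sketch: read the S3Payload pointer from the SQS record body and fetch
// the real payload from S3 inside the Lambda handler itself.
exports.handler = async (event) => {
  for (const record of event.Records) {
    const { S3Payload } = JSON.parse(record.body);
    const object = await s3
      .getObject({ Bucket: S3Payload.Bucket, Key: S3Payload.Key })
      .promise();
    const payload = JSON.parse(object.Body.toString('utf-8'));
    console.log('payload', payload);
  }
};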

funkel1989 commented 3 years ago

@lesimoes Please share your solution

lesimoes commented 3 years ago

@funkel1989 I created a separate package, because my solution doesn't use the same approach (a consumer):

sqs-huge-message

danrivett commented 2 years ago

Thanks, all, for creating solutions to this problem; it's been very helpful for understanding the different options for sending and receiving large SQS messages.

I see @funkel1989 has since created Battle-Line-Productions/sqs-large-payload-nodejs, which looks like a good fit for my use case, since it separates processing the sent SQS message from receiving it, so I can wire it into an SQS-triggered Lambda that delivers the message to the function.

Just thought I'd mention it here since it seems related to this issue and may serve as inspiration for this ticket, and I see the repo's author (@funkel1989) in the comments above. If you have any other relevant info, Michael, please feel free to comment.