arithmetric / aws-lambda-ses-forwarder

Serverless email forwarding using AWS Lambda and SES
MIT License

node 18.x support #144

Open clarson opened 1 year ago

clarson commented 1 year ago

Hi,

This script does not work when the Lambda function is configured to run Node 18. The AWS Lambda Node 18 runtime ships with AWS SDK v3 rather than v2, so the require('aws-sdk') statement no longer works.

More information here: https://aws.amazon.com/blogs/compute/node-js-18-x-runtime-now-available-in-aws-lambda/
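
For illustration, a minimal sketch of the breakage (the package names are the real SDK modules; the rest is illustrative):

  // Works on the Node.js 16 runtime; fails on Node.js 18, which no longer bundles SDK v2:
  var AWS = require('aws-sdk'); // throws: Cannot find module 'aws-sdk'

  // Node.js 18 bundles AWS SDK v3 instead, which uses modular clients:
  var { S3Client } = require('@aws-sdk/client-s3');
  var { SESClient } = require('@aws-sdk/client-ses');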

Thanks, Chuck

ellis-sutehall commented 1 year ago

I'm having the same problem. I had to update my script, SES forced me to upgrade the Node version, and now it doesn't work.

Were you able to fix it?

clarson commented 1 year ago

I was able to go back to Node 16, which is still supported (I just checked before this reply).

mylesboone commented 11 months ago

Here's code that will run on Node 18.x and uses SES v1 client:

https://gist.github.com/mylesboone/b6113f8dd74617d27f54e5d0b8598ff7
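
Not the gist itself, but a rough sketch of the shape of that migration, in case it helps (the bucket, key, and region here are placeholders):

  const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
  const { SESClient, SendRawEmailCommand } = require('@aws-sdk/client-ses');

  const s3 = new S3Client({ region: 'us-east-1' });
  const ses = new SESClient({ region: 'us-east-1' });

  async function forwardRaw(bucket, key) {
    // v2 returned a Buffer from s3.getObject(...).promise(); v3 returns a stream,
    // so the body has to be read out explicitly.
    const obj = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
    const raw = await obj.Body.transformToString();
    // v2: ses.sendRawEmail(params).promise(); v3 wraps the params in a command object.
    await ses.send(new SendRawEmailCommand({ RawMessage: { Data: Buffer.from(raw) } }));
  }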

Minjun-KANG commented 11 months ago

Here's code that will run on Node 18.x and uses SES v1 client:

https://gist.github.com/mylesboone/b6113f8dd74617d27f54e5d0b8598ff7

error: PermanentRedirect: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.

You may encounter the above error message. In this case, you can set the region on the S3 client as shown below. Thank you for sharing the code.

  // in the handler's defaults, where the fallback S3 client is created:
  s3: overrides && overrides.s3 ? overrides.s3 : new S3Client({signatureVersion: 'v4', region: 'ap-northeast-2'})
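
As I understand it (not from the gist), the v3 S3Client always signs with SigV4, so signatureVersion is a leftover v2 option that v3 ignores; the part that fixes the PermanentRedirect is pinning the region:

  const { S3Client } = require('@aws-sdk/client-s3');
  // Point the client at the bucket's own region so GetObject is not redirected.
  const s3 = new S3Client({ region: 'ap-northeast-2' });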

arithmetric commented 11 months ago

Thanks for this report! I've updated the documentation to recommend using the Node.js 16 runtime for now: https://github.com/arithmetric/aws-lambda-ses-forwarder/commit/67bc86b44823099c838a82a499ff095d508737a8

However I also found that the code works in Node.js 18 as is if you install the AWS SDK v2 module and its dependencies.

To do this, change the following step in the documented Set Up process:

For the Lambda function code, either copy and paste the contents of index.js into the inline code editor or zip the contents of the repository and upload them directly or via S3.

to the following instead:

  1. In a copy of this repository, run npm i --omit=dev
  2. Create a .zip file with the contents of this repository. Note: index.js should be at the root of the zip file and not in a directory.
  3. Upload the zip file to Lambda.
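
The reason this works: with the aws-sdk v2 package vendored into the deployment package, the original require resolves from the bundled node_modules instead of relying on the runtime:

  var AWS = require('aws-sdk'); // now resolves from node_modules inside the zip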

Let me know if this approach works for you.

lvn1 commented 9 months ago

Here's code that will run on Node 18.x and uses SES v1 client:

https://gist.github.com/mylesboone/b6113f8dd74617d27f54e5d0b8598ff7

I managed to get it working in Node.js 18 by modifying the gist and only using the lambda code editor, without installing any additional dependencies.

CloudWatch errors were showing:

  • require is not defined in ES module scope, you can use import instead
  • exports is not defined in ES module scope

I just updated the import statements, changed the 5 exports to consts and changed the exports.handler to export const handler.

https://gist.github.com/lvn1/9814d301bead8e0ecfc79ba9efe64b4a
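
For anyone making the same edits by hand, the conversion looks roughly like this (a sketch; the handler body and the remaining exports are elided):

  // CommonJS, as in the original gist:
  // const { SESClient } = require('@aws-sdk/client-ses');
  // exports.handler = function (event, context, callback) { ... };

  // ES module form (Lambda parses the file as ESM when it is named index.mjs):
  import { SESClient } from '@aws-sdk/client-ses';
  export const handler = (event, context, callback) => { /* ... */ };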

hacnet commented 4 months ago

Here's code that will run on Node 18.x and uses SES v1 client: https://gist.github.com/mylesboone/b6113f8dd74617d27f54e5d0b8598ff7

I managed to get it working in Node.js 18 by modifying the gist and only using the lambda code editor, without installing any additional dependencies.

CloudWatch errors were showing:

  • require is not defined in ES module scope, you can use import instead
  • exports is not defined in ES module scope

I just updated the import statements, changed the 5 exports to consts and changed the exports.handler to export const handler.

https://gist.github.com/lvn1/9814d301bead8e0ecfc79ba9efe64b4a

The 2nd gist presumably needs to have lines 3-4 replaced with those of the 1st link.

It then still fails, with export const handler reported as unexpected, and/or related steps need further modification.
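
(For what it's worth: export syntax is only valid when Lambda parses the handler file as an ES module, i.e. when it is named index.mjs or the package sets "type": "module"; in a plain index.js, Node reports export const handler as an unexpected token.)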

mylesboone commented 4 months ago

@hacnet I'd suggest using https://github.com/arithmetric/aws-lambda-ses-forwarder/pull/147

hacnet commented 4 months ago

@hacnet I'd suggest using #147

That fixed it, many thanks!