Closed gruckion closed 1 year ago
Is your request related to a problem? Please describe.
We have many Lambda functions grouped into multiple Node.js packages by business domain. One package might have 3 handlers and another may have 1. We group them this way because Lambdas in the same business domain share identical dependencies.
Our Node.js project uses Yarn workspaces, which lets us group related Lambda functions into a single package. We also have a shared-code package that the Lambda packages reference (this would be a good use case for Lambda layers). Workspaces also give us a common root directory from which we can run a script against all the Lambda packages, e.g. running all tests, transpiling, or deploying (via the Serverless Framework). We use webpack to transpile TypeScript into JavaScript for runtime.
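For context, a minimal sketch of the root `package.json` for this kind of layout (the package glob and script names are illustrative, not our exact setup):

```json
{
  "private": true,
  "workspaces": ["packages/*"],
  "scripts": {
    "test": "yarn workspaces run test",
    "build": "yarn workspaces run build",
    "deploy": "yarn workspaces run deploy"
  }
}
```

Running `yarn build` from the root then invokes each Lambda package's own webpack build.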
As such we can't run `npm install` and must use `yarn install`.
Describe the solution you'd like.
How can we use this package with Yarn?
How can we have multiple Lambda handlers in a single npm project that gets packaged up as multiple zip files?
How can we run these Lambda functions locally for testing, and how can mocking of DynamoDB be handled in this context?
Describe alternatives you've considered.
We are considering setting `npm_requirements = false`, then providing `commands = []` to trigger the transpile step from TypeScript to JavaScript, have it produce multiple `handler.js` files, and then load each one into the `path` variable. The issue is that each usage of the module will then need to produce the package zip for its respective handler, but every use will run the transpile command and build ALL the individual handlers while only one gets zipped. This is a little wasteful, so it looks like we will need to ensure the transpile step can single out the specific handler of interest.
Additional context
I'm about to set up AWS Step Functions to call Lambda functions. The Step Function module needs the Lambda ARNs, and our Lambda project needs a reference to the Step Function ARN, so we have a chicken-and-egg problem. In order to apply our Terraform we first need to create our Lambdas (which is currently done using the Serverless Framework), but the Lambdas read the Step Function ARN from Parameter Store.
The only way we can do this at the moment is to temporarily remove the environment variables loaded from SSM Parameter Store, deploy the Lambdas, then apply the Terraform infrastructure, and finally deploy the Serverless Lambda functions a second time with the SSM-backed environment variables added back in. It's dirty.
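Concretely, the current two-pass workaround looks roughly like this (the commands and stage name are illustrative):

```shell
# 1. Deploy the Lambdas with the SSM-backed environment variables
#    temporarily removed from serverless.yml, so the functions can be
#    created before the Step Function exists.
npx serverless deploy --stage dev

# 2. Apply the Terraform that creates the Step Function and writes its
#    ARN into SSM Parameter Store.
terraform apply

# 3. Restore the SSM-backed environment variables in serverless.yml and
#    deploy a second time so the Lambdas pick up the ARN.
npx serverless deploy --stage dev
```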
Pros and cons
SFN lambdas in serverless
PRO:
CON:
SFN lambdas in Terraform
PRO:
CON: