terraform-aws-modules / terraform-aws-lambda

Terraform module that takes care of many AWS Lambda/serverless tasks (building dependencies, packaging, updates, deployments) in countless combinations 🇺🇦
https://registry.terraform.io/modules/terraform-aws-modules/lambda/aws
Apache License 2.0
886 stars · 657 forks

Error on CI [Error: local-exec provisioner error] #428

Closed · lpossamai closed this issue 1 year ago

lpossamai commented 1 year ago

Description

I have the same code deployed to different accounts using GitHub Actions. I've implemented the solution described here, but I'm getting the following error because GitHub Actions cannot find the package.

##[debug]│ Error: local-exec provisioner error
##[debug]│ 
##[debug]│   with module.lambda_function.null_resource.archive[0],
##[debug]│   on .terraform/modules/lambda_function/package.tf line 67, in resource "null_resource" "archive":
##[debug]│   67:   provisioner "local-exec" {
##[debug]│ 
##[debug]│ Error running command
##[debug]│ './.terraform/lambda-builds/package_dir/NotificationHandler-dev/dev/aeb134742ebf21621c1345f9560df33ade546d008e322876157bea4e1ebdf5f7.plan.json':
##[debug]│ exit status 1. Output: Traceback (most recent call last):
##[debug]│   File "/home/runner/work/infrastructure/infrastructure/terraform/push-notifications/.terraform/modules/lambda_function/package.py", line 1627, in <module>
##[debug]│     main()
##[debug]│   File "/home/runner/work/infrastructure/infrastructure/terraform/push-notifications/.terraform/modules/lambda_function/package.py", line 1623, in main
##[debug]│     exit(args.command(args))
##[debug]│   File "/home/runner/work/infrastructure/infrastructure/terraform/push-notifications/.terraform/modules/lambda_function/package.py", line 1494, in build_command
##[debug]│     with open(args.build_plan_file) as f:
##[debug]│ FileNotFoundError: [Errno 2] No such file or directory:
##[debug]│ './.terraform/lambda-builds/package_dir/NotificationHandler-dev/dev/aeb134742ebf21621c1345f9560df33ade546d008e322876157bea4e1ebdf5f7.plan.json'
##[debug]│ 
##[debug]╵
##[debug]
##[debug]exitcode: 1

Versions

Reproduction Code [Required]

module "lambda_function" {
  source  = "terraform-aws-modules/lambda/aws"
  version = "v4.10.1"

  function_name = local.function_name
  description   = "Client Email/SMS reminder function"
  handler       = "app.handler"
  runtime       = "nodejs16.x"
  publish       = false

  source_path = "./src/nodejs-app"

  store_on_s3               = true
  s3_bucket                 = module.s3_bucket_lambda.s3_bucket_id
  s3_server_side_encryption = "aws:kms"

  create_async_event_config      = true
  reserved_concurrent_executions = 1
  timeout                        = 900 # 15 minutes

  artifacts_dir = "${path.root}/.terraform/lambda-builds/package_dir/${local.function_name}/${terraform.workspace}"
}

My pipeline runs terraform plan and uploads the plan file to GitHub; later, when the PR is merged, GitHub downloads that plan file and runs terraform apply.

I don't know whether that process is the reason I see this error: ##[debug]module.lambda_function.null_resource.archive[0] (local-exec): FileNotFoundError: [Errno 2] No such file or directory: './.terraform/lambda-builds/package_dir/NotificationHandler-dev/dev/aeb134742ebf21621c1345f9560df33ade546d008e322876157bea4e1ebdf5f7.plan.json'?

MichielBijland commented 1 year ago

@lpossamai I had the same issue; I fixed it by storing the builds folder in the plan step and restoring it in the apply step.
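
One way to sketch that approach in GitHub Actions (the job layout, artifact name, and builds path below are illustrative assumptions, not MichielBijland's actual workflow) is to persist the module's artifacts directory between the plan and apply jobs with actions/upload-artifact and actions/download-artifact:

```yaml
# Illustrative sketch only -- artifact name and paths are assumptions.
# The builds folder contains the hash-named *.plan.json files, so the whole
# directory is persisted rather than trying to name an individual file.

# plan job, after `terraform plan -out=terraform.plan`:
- name: Store plan and builds folder
  uses: actions/upload-artifact@v3
  with:
    name: terraform-plan
    path: |
      terraform.plan
      .terraform/lambda-builds/

# apply job, before `terraform apply terraform.plan`:
- name: Restore plan and builds folder
  uses: actions/download-artifact@v3
  with:
    name: terraform-plan
```

With download-artifact@v3 the files should be extracted into the working directory by default, so the hash-named .plan.json files end up back at the relative paths recorded during plan.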

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has been open 30 days with no activity. Remove stale label or comment or this issue will be closed in 10 days

atefhaloui commented 1 year ago

I have the same issue with my gitlab project containing a lambda + 3 layers. I'm using the standard template: https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/ci/templates/Terraform.gitlab-ci.yml

atefhaloui commented 1 year ago

My workaround: use the hashicorp/archive provider and set the packages directory where archives are stored as a cache in my .gitlab-ci.yml:

.gitlab-ci.yml:

cache:
  key: "${TF_ROOT}"
  paths:
  - "${TF_ROOT}/.terraform/"
  - "${TF_ROOT}/packages/"

Terraform:

locals {
  package_dir = "${path.cwd}/packages"
}

data "archive_file" "main" {
  type             = "zip"
  source_file      = "${path.cwd}/src/index.py"
  output_file_mode = "0666"
  output_path      = "${local.package_dir}/main.zip"
}

module "lambda_function" {
  source  = "terraform-aws-modules/lambda/aws"
  version = "~> v4.16.0"

  create_package          = false
  ignore_source_code_hash = false

  handler                = "index.lambda_handler"
  runtime                = "python3.7"
  local_existing_package = "${local.package_dir}/main.zip"
...
}

lpossamai commented 1 year ago

> @lpossamai I had the same issue; I fixed it by storing the builds folder in the plan step and restoring it in the apply step.

Hi @MichielBijland, are you able to share some of the solution here, please? The build file the Lambda module creates is a .zip file, but the name is random, so it's hard to specify it using the actions/upload-artifact@v3 action.

MichielBijland commented 1 year ago

@lpossamai Sure, we use an S3 bucket to store plans and build artefacts using merge request flows.

plan job

- name: Store terraform plan
  run: aws s3 cp --sse AES256 --recursive --exclude '*' --include "deployments/*/terraform.plan" --include "deployments/*/builds/*.plan.json" . s3://${{ env.AWS_PLAN_BUCKET }}/plans/${{ github.event.pull_request.number }}/

apply job

- name: Retrieve terraform plan
  run: aws s3 cp --sse AES256 --recursive s3://${{ env.AWS_PLAN_BUCKET }}/plans/${{ github.event.pull_request.number }}/ .

- name: apply terraform
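
The run command for the apply step is not shown in the comment; under the same layout it would presumably look something like this (the -chdir path and plan filename are assumptions based on the includes in the plan job, not MichielBijland's actual command):

```yaml
# Hypothetical apply step; deployments/my-app mirrors the paths used in the plan job
- name: Apply terraform
  run: terraform -chdir=deployments/my-app apply terraform.plan
```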

- name: Cleanup terraform plan
  run: aws s3 rm --recursive s3://${{ env.AWS_PLAN_BUCKET }}/plans/${{ github.event.pull_request.number }}/

Paths might be different in your own workflow.

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has been open 30 days with no activity. Remove stale label or comment or this issue will be closed in 10 days

github-actions[bot] commented 1 year ago

This issue was automatically closed because it had been stale for 10 days.

github-actions[bot] commented 12 months ago

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.