Closed ffleandro closed 1 year ago
Having the same issue 👍🏻
I have the same problem. It did not work even after specifying a new version for the S3 file. My aim is to separate the IaC and Lambda code, and this problem prevents that.
Make sure to specify `version_id` so the Lambda function tracks changes when the key of the S3 object is unchanged but its version is:
s3_existing_package = {
  bucket     = module.s3_bucket.s3_bucket_id
  key        = aws_s3_object.app_dist_zip.id
  version_id = aws_s3_object.app_dist_zip.version_id # <-- this one
}
S3 bucket versioning should be enabled, too.
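Enabling versioning can be sketched like this; a minimal sketch assuming a plain `aws_s3_bucket` resource (the bucket and resource names here are hypothetical, not from the thread):

```hcl
# Versioning must be enabled on the bucket, otherwise
# aws_s3_object.version_id stays null and the Lambda never sees a change.
resource "aws_s3_bucket" "app_dist" {
  bucket = "my-app-dist-bucket" # hypothetical name
}

resource "aws_s3_bucket_versioning" "app_dist" {
  bucket = aws_s3_bucket.app_dist.id

  versioning_configuration {
    status = "Enabled"
  }
}
```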
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.
Description

`local_existing_package` updates the lambda correctly, but `s3_existing_package` doesn't.

In my terraform scripts I manually upload the lambda code into an s3 bucket using:
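The upload block itself was not captured in this thread; a minimal sketch of what such an upload typically looks like (the resource name `app_dist_zip` comes from the answer above, while the file path and `source_hash` wiring are assumptions):

```hcl
# Upload the packaged Lambda code; source_hash makes Terraform re-upload
# the object whenever the zip content changes.
resource "aws_s3_object" "app_dist_zip" {
  bucket      = module.s3_bucket.s3_bucket_id
  key         = "app_dist.zip"                       # hypothetical key
  source      = "${path.module}/dist/app_dist.zip"   # hypothetical path
  source_hash = filemd5("${path.module}/dist/app_dist.zip")
}
```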
I've confirmed that the s3 object is properly updated every time the md5 of the zip file changes. However, the lambda never updates when this object changes, even if I run apply a few times sequentially. Initially I thought the lambda was using the `source_hash` of the file that existed on S3 before the upload, so a second attempt would fix it since by then the s3 object would already be online.

This is my lambda example:
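The module block was also not captured; a sketch of how the terraform-aws-lambda module is typically wired to an existing S3 package (the module name `lambda_service` comes from the Expected behavior section below; the function name, handler, and runtime are assumptions):

```hcl
module "lambda_service" {
  source  = "terraform-aws-modules/lambda/aws"
  version = "4.7.2" # module version from the report

  function_name = "lambda-service" # hypothetical
  handler       = "index.handler"  # assumption
  runtime       = "nodejs18.x"     # assumption

  create_package = false
  s3_existing_package = {
    bucket = module.s3_bucket.s3_bucket_id
    key    = aws_s3_object.app_dist_zip.id
    # version_id omitted -- the situation the answer above fixes
  }
}
```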
Versions
Module version [Required]: 4.7.2
Terraform version:
provider registry.terraform.io/hashicorp/aws v4.50.0
provider registry.terraform.io/hashicorp/external v2.2.3
provider registry.terraform.io/hashicorp/local v2.3.0
provider registry.terraform.io/hashicorp/null v3.2.1
Expected behavior

`module.lambda_service.lambda_function_source_code_hash` would change every time a new file is uploaded to the s3 bucket.

Actual behavior

`module.lambda_service.lambda_function_source_code_hash` doesn't change when a new file is uploaded to the s3 bucket.