Open mcapts opened 5 years ago
Is there a common pattern that people use as a workaround to this missing functionality?
One workaround I've seen implemented is using an MD5 hash of the zipped file as the filename. Diarmuid Mac Namara wrote this pattern up here: https://diarmuid.ie/blog/setting-up-a-recurring-google-cloud-function-with-terraform. I don't know how common this pattern is.
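A minimal sketch of that MD5-naming pattern, assuming a pre-built zip and a bucket resource named `google_storage_bucket.bucket` (both hypothetical names; `filemd5()` requires Terraform 0.12+):

```hcl
# Sketch only: embedding the archive's MD5 in the object name means a
# content change produces a new object name, which in turn forces the
# function's source_archive_object to change on the next plan.
resource "google_storage_bucket_object" "archive" {
  name   = "function-${filemd5("path/to/function.zip")}.zip"
  bucket = google_storage_bucket.bucket.name
  source = "path/to/function.zip"
}
```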
I ended up working around this using the following pattern:
```hcl
data "archive_file" "some_archive" {
  type        = "zip"
  source_dir  = "..."
  output_path = "..."
}

resource "google_storage_bucket_object" "some_object" {
  name   = "${basename(data.archive_file.some_archive.output_path)}-${data.archive_file.some_archive.output_sha}"
  source = data.archive_file.some_archive.output_path
  ...
}

resource "google_cloudfunctions_function" "function" {
  source_archive_object = google_storage_bucket_object.some_object.name
  ...
}
```
By naming the object after the archive's SHA checksum, the object (and therefore the function) is updated only when the source changes.
It looks like the AWS `source_code_hash` field is an API field. We might be able to add a provider-local field that's used only for change detection, but I'm not sure that's something we'd want to do.
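As a rough illustration of what such a local-only field might look like (this attribute does not exist on the Google provider today; the name and wiring are purely hypothetical):

```hcl
# Hypothetical: a provider-local attribute, analogous to AWS's
# source_code_hash, that only participates in change detection.
resource "google_cloudfunctions_function" "function" {
  source_archive_bucket = google_storage_bucket.bucket.name
  source_archive_object = google_storage_bucket_object.some_object.name
  source_code_hash      = data.archive_file.some_archive.output_base64sha256
}
```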
Description
The google_cloudfunctions_function resource will not currently update if you change your source code. The current steps I use are:
If I update my .py file, the zipped object is re-uploaded to GCS, but the function does not update. AWS Lambda supports `source_code_hash`, which solves this: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_function#source_code_hash
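For comparison, the AWS equivalent (following the linked docs; resource names here are placeholders) redeploys whenever the archive's hash changes:

```hcl
# source_code_hash takes a base64-encoded SHA-256 of the deployment
# package; filebase64sha256() recomputes it whenever the zip changes,
# which triggers an update of the function.
resource "aws_lambda_function" "example" {
  filename         = "lambda.zip"
  function_name    = "example"
  role             = aws_iam_role.lambda.arn
  handler          = "main.handler"
  runtime          = "python3.9"
  source_code_hash = filebase64sha256("lambda.zip")
}
```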