hashicorp / terraform-provider-google

Terraform Provider for Google Cloud Platform
https://registry.terraform.io/providers/hashicorp/google/latest/docs
Mozilla Public License 2.0

Please add support for source_code_hash in google_cloudfunctions_function #3793

Open mcapts opened 5 years ago

mcapts commented 5 years ago


Description

The google_cloudfunctions_function resource does not currently update when you change your source code. The steps I currently use are:

  1. Render my .py file as a template_file
  2. Zip this as an archive_file
  3. Upload this .zip to GCS via google_storage_bucket_object
  4. Use the bucket/object name from step 3 when creating a google_cloudfunctions_function

This is similar to the example in the provider documentation: https://www.terraform.io/docs/providers/google/r/cloudfunctions_function.html

If I update my .py file, the zipped object gets re-uploaded to GCS, but the function itself does not update. The AWS provider supports source_code_hash on aws_lambda_function, which solves this: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_function#source_code_hash
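For reference, a minimal sketch of this pipeline (the resource names, paths, runtime, and bucket below are placeholders, not my real configuration):

data "template_file" "main_py" {
  template = file("${path.module}/main.py.tpl")
  vars = {
    greeting = "hello"
  }
}

data "archive_file" "function_zip" {
  type        = "zip"
  output_path = "${path.module}/function.zip"

  source {
    content  = data.template_file.main_py.rendered
    filename = "main.py"
  }
}

resource "google_storage_bucket" "source" {
  name     = "my-function-source"
  location = "US"
}

resource "google_storage_bucket_object" "function_zip" {
  name   = "function.zip"
  bucket = google_storage_bucket.source.name
  source = data.archive_file.function_zip.output_path
}

resource "google_cloudfunctions_function" "function" {
  name                  = "my-function"
  runtime               = "python37"
  entry_point           = "main"
  source_archive_bucket = google_storage_bucket.source.name
  source_archive_object = google_storage_bucket_object.function_zip.name
  trigger_http          = true
}

Even when the archive contents change and the bucket object is replaced, the function keeps running the old code, because source_archive_object still resolves to the same "function.zip" name and the function resource sees no diff.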

New or Affected Resource(s)

google_cloudfunctions_function

Potential Terraform Configuration

# Propose what you think the configuration to take advantage of this feature should look like.
# We may not use it verbatim, but it's helpful in understanding your intent.
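
One possibility, loosely modeled on the AWS provider's source_code_hash argument. The source_code_hash line below is hypothetical and does not exist in the google provider today; the surrounding names are the placeholders from the sketch above.

resource "google_cloudfunctions_function" "function" {
  name                  = "my-function"
  runtime               = "python37"
  entry_point           = "main"
  source_archive_bucket = google_storage_bucket_object.function_zip.bucket
  source_archive_object = google_storage_bucket_object.function_zip.name

  # Hypothetical new argument: any change to the archive contents changes
  # this value and forces the function to redeploy.
  source_code_hash = data.archive_file.function_zip.output_base64sha256

  trigger_http = true
}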

References

thommahoney commented 3 years ago

Is there a common pattern that people use as a workaround to this missing functionality?

iennae commented 3 years ago

One workaround that I've seen implemented is using an MD5 hash of the zipped file as the filename. Diarmuid Mac Namara wrote this pattern up here: https://diarmuid.ie/blog/setting-up-a-recurring-google-cloud-function-with-terraform. I don't know how common this pattern is.
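
Roughly, it looks like this (a sketch with assumed names; see the linked post for the full setup):

resource "google_storage_bucket_object" "function_zip" {
  # Embedding the archive's MD5 in the object name creates a new object
  # whenever the source changes, which forces a redeploy of any function
  # whose source_archive_object references this name.
  name   = "function-${data.archive_file.function_zip.output_md5}.zip"
  bucket = google_storage_bucket.source.name
  source = data.archive_file.function_zip.output_path
}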

thommahoney commented 3 years ago

I ended up working around this using the following pattern:

data "archive_file" "some_archive" {
  type        = "zip"
  source_dir  = "..."
  output_path = "..."
}

resource "google_storage_bucket_object" "some_object" {
  name   = "${basename(data.archive_file.some_archive.output_path)}-${data.archive_file.some_archive.output_sha}"
  source = data.archive_file.some_archive.output_path
  ...
}

resource "google_cloudfunctions_function" "function" {
  source_archive_object = google_storage_bucket_object.some_object.name
  ...
}

By naming the object based on the SHA checksum of the archive, the object and the function are updated only when the source changes.

melinath commented 2 years ago

It looks like the aws source_code_hash field is an API field. We might be able to make a local field that's only used for change detection, but I'm not sure that's something we'd want to do.
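
For comparison, the AWS field is typically fed from a hash of the local archive (a minimal, illustrative snippet; the IAM role reference is assumed to exist elsewhere):

resource "aws_lambda_function" "example" {
  function_name = "example"
  filename      = "lambda.zip"
  handler       = "main.handler"
  runtime       = "python3.9"
  role          = aws_iam_role.lambda.arn

  # Changing the archive changes this hash, which triggers an update of the
  # function code.
  source_code_hash = filebase64sha256("lambda.zip")
}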