hashicorp / terraform-provider-google

Terraform Provider for Google Cloud Platform
https://registry.terraform.io/providers/hashicorp/google/latest/docs
Mozilla Public License 2.0

google_dataflow_job - Cannot set autoscalingAlgorithm to `THROUGHPUT_BASED` #18001

Open 0xNFANZ opened 4 months ago

0xNFANZ commented 4 months ago

Community Note

Terraform Version

Terraform v1.8.2 on darwin_arm64

Affected Resource(s)

google_dataflow_job

Terraform Configuration

resource "google_dataflow_job" "dataflow_job" {
  project                      = var.project_id
  name                         = "dataflow-job"
  template_gcs_path            = "gs://dataflow-templates/2024-03-27-00_RC00/Cloud_PubSub_to_Splunk"
  temp_gcs_location            = "XXX"
  service_account_email        = var.service_account_email
  machine_type                 = var.machine_type
  max_workers                  = var.max_workers
  zone                         = var.zone
  skip_wait_on_job_termination = true

  parameters = {
    autoscalingAlgorithm                         = "THROUGHPUT_BASED"
    inputSubscription                            = var.pubsub_subscription_id
    outputDeadletterTopic                        = google_pubsub_topic.gcloud_dataflow_deadletter_pubsub_topic.id
    parallelism                                  = var.max_workers * local.vCPUs * 2
    url                                          = "URL"
    batchCount                                   = 50
    includePubsubMessage                         = "true"
    disableCertificateValidation                 = "sure"
    enableBatchLogs                              = true
    enableGzipHttpCompression                    = true
    tokenSource                                  = "SECRET_MANAGER"
    tokenSecretId                                = "XXX"
    javascriptTextTransformGcsPath               = "gs://bucket/file.js"
    javascriptTextTransformFunctionName          = "function_name"
    javascriptTextTransformReloadIntervalMinutes = 15
  }
  region                 = var.region
  subnetwork             = var.subnetwork
  network                = var.network
  ip_configuration       = "WORKER_IP_PRIVATE"
  additional_experiments = ["min_num_workers=${var.min_workers}"]

  lifecycle {
    ignore_changes = [
      additional_experiments # Ignore default experiments that may be added by Dataflow templates API
    ]

    replace_triggered_by = [
      terraform_data.topic_replacement,
      terraform_data.subscription_replacement,
      google_pubsub_topic.gcloud_dataflow_deadletter_pubsub_topic,
      google_pubsub_subscription.gcloud_dataflow_deadletter_pubsub_sub
    ]
  }
}

Debug Output

No response

Expected Behavior

A new job is created with `autoscalingAlgorithm` set to `THROUGHPUT_BASED`.

Actual Behavior

The apply fails with an invalid-template-parameters error:

│ Error: googleapi: Error 400: The template parameters are invalid.
│ Details:
│ [
│   {
│     "@type": "type.googleapis.com/google.dataflow.v1beta3.InvalidTemplateParameters",
│     "parameterViolations": [
│       {
│         "description": "Unrecognized parameter",
│         "parameter": "autoscalingAlgorithm"
│       }
│     ]
│   }
│ ]
│ , badRequest
│ 
│   with module.zafin_anzplus_dataflow_job[0].google_dataflow_job.dataflow_job,
│   on ../../modules/logging-pipeline/pipeline.tf line 24, in resource "google_dataflow_job" "dataflow_job":

Steps to reproduce

  1. terraform apply

Important Factoids

References

#17570: a fix was implemented for `google_dataflow_flex_template_job`, but not for `google_dataflow_job`
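Until `google_dataflow_job` gets the same treatment, the referenced fix suggests a possible workaround: launch a flex-template variant of the pipeline with `google_dataflow_flex_template_job`, which exposes autoscaling as a resource argument rather than a template parameter. A minimal sketch, assuming a flex variant of the Splunk template exists and the `google-beta` provider is configured; the container path and retained parameters are illustrative, not taken from this report:

```hcl
# Sketch only: google_dataflow_flex_template_job sends autoscaling settings
# in the launch environment, so THROUGHPUT_BASED is not rejected as an
# unrecognized template parameter. Paths and values are placeholders.
resource "google_dataflow_flex_template_job" "dataflow_job" {
  provider                = google-beta
  project                 = var.project_id
  name                    = "dataflow-job"
  region                  = var.region
  container_spec_gcs_path = "gs://dataflow-templates/latest/flex/Cloud_PubSub_to_Splunk" # placeholder
  autoscaling_algorithm   = "THROUGHPUT_BASED"
  max_workers             = var.max_workers
  machine_type            = var.machine_type
  service_account_email   = var.service_account_email

  parameters = {
    inputSubscription = var.pubsub_subscription_id
    tokenSource       = "SECRET_MANAGER"
  }
}
```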

b/339853870

ggtisc commented 4 months ago

Confirmed issue!

After running `terraform apply`, it returns error 400: The template parameters are invalid