hashicorp / terraform-provider-google

Terraform Provider for Google Cloud Platform
https://registry.terraform.io/providers/hashicorp/google/latest/docs
Mozilla Public License 2.0

google_data_loss_prevention_job_trigger - Error: Error creating JobTrigger: googleapi: Error 400: JobTrigger must contain a single schedule. #16409

Open · kolban-google opened 10 months ago

kolban-google commented 10 months ago

I am trying to create a DLP Job Trigger of type manual.

I am following the documentation here:

https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/data_loss_prevention_job_trigger

When I execute terraform apply, it fails as described below.

Terraform Version

terraform -v
Terraform v1.6.1
on linux_amd64
+ provider registry.terraform.io/hashicorp/google v5.3.0
+ provider registry.terraform.io/hashicorp/google-beta v5.3.0
+ provider registry.terraform.io/hashicorp/random v3.5.1

Your version of Terraform is out of date! The latest version
is 1.6.2. You can update by downloading from https://www.terraform.io/downloads.html

Affected Resource(s)

google_data_loss_prevention_job_trigger

Terraform Configuration Files

resource "google_data_loss_prevention_job_trigger" "customers" {
  parent = "projects/${var.PROJECT_ID}"

  triggers {
    manual {}
  }
  inspect_job {
    storage_config {
      big_query_options {
        table_reference {
          project_id = google_bigquery_table.customers.project
          dataset_id = google_bigquery_table.customers.dataset_id
          table_id   = google_bigquery_table.customers.table_id
        }
        rows_limit    = 1000
        sample_method = "RANDOM_START"
      }
    }
  }
}

Debug Output

Terraform will perform the following actions:

  # google_data_loss_prevention_job_trigger.customers will be created
  + resource "google_data_loss_prevention_job_trigger" "customers" {
      + create_time   = (known after apply)
      + id            = (known after apply)
      + last_run_time = (known after apply)
      + name          = (known after apply)
      + parent        = "projects/kolban-dataplex-demo-10-06-02"
      + status        = "HEALTHY"
      + trigger_id    = (known after apply)
      + update_time   = (known after apply)

      + inspect_job {
          + storage_config {
              + big_query_options {
                  + rows_limit    = 1000
                  + sample_method = "RANDOM_START"

                  + table_reference {
                      + dataset_id = "customers_ds"
                      + project_id = "kolban-dataplex-demo-10-06-02"
                      + table_id   = "customers"
                    }
                }
            }
        }

      + triggers {
          + manual {}
        }
    }

Plan: 1 to add, 0 to change, 0 to destroy.

Do you want to perform these actions?
  Terraform will perform the actions described above.
  Only 'yes' will be accepted to approve.

  Enter a value: yes

google_data_loss_prevention_job_trigger.customers: Creating...
╷
│ Error: Error creating JobTrigger: googleapi: Error 400: JobTrigger must contain a single schedule.
│ 
│   with google_data_loss_prevention_job_trigger.customers,
│   on sdp.tf line 1, in resource "google_data_loss_prevention_job_trigger" "customers":
│    1: resource "google_data_loss_prevention_job_trigger" "customers" {
│ 

b/311726392

kolban-google commented 10 months ago

If I were to make an additional guess ... the core of the error is that Terraform seems to "require" me to specify a "triggers" block ... however, reading the spec of the API, it appears that triggers is optional ... in my environment, I want to omit triggers completely, but Terraform won't allow me to.
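
For what it's worth, the error message asks for a schedule, so a schedule-based trigger for this same BigQuery job might look like the rough sketch below (the daily recurrence value is just a guess on my part):

resource "google_data_loss_prevention_job_trigger" "customers" {
  parent = "projects/${var.PROJECT_ID}"

  # Schedule trigger instead of manual {}. The recurrence period is an
  # assumed value (one day, expressed in seconds).
  triggers {
    schedule {
      recurrence_period_duration = "86400s"
    }
  }

  inspect_job {
    storage_config {
      big_query_options {
        table_reference {
          project_id = google_bigquery_table.customers.project
          dataset_id = google_bigquery_table.customers.dataset_id
          table_id   = google_bigquery_table.customers.table_id
        }
        rows_limit    = 1000
        sample_method = "RANDOM_START"
      }
    }
  }
}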

If it helps, I am a Google employee and can be reached at LDAP kolban@google.com

melinath commented 9 months ago

It looks like manual triggers can only be used with "hybrid" jobs, which I believe are jobs using hybrid_options; your job uses big_query_options. This is an API-side restriction.
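
For reference, the documented pattern pairs manual {} with hybrid_options. A minimal sketch, assuming placeholder project and template names:

resource "google_data_loss_prevention_job_trigger" "hybrid" {
  parent = "projects/my-project"

  # Manual triggers are accepted by the API only for hybrid jobs.
  triggers {
    manual {}
  }

  inspect_job {
    # Placeholder inspect template; substitute a real template name.
    inspect_template_name = "projects/my-project/inspectTemplates/my-template"
    storage_config {
      # hybrid_options marks this as a hybrid job; its fields appear to be
      # optional, and the description here is a placeholder.
      hybrid_options {
        description = "Hybrid job for externally submitted data"
      }
    }
  }
}

Presumably the API ties manual triggers to hybrid jobs because hybrid jobs inspect data that is pushed to them, rather than scanning storage on a schedule.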

patrickmoy commented 2 months ago

Based on the previous comment, it looks like this is just an invalid-input case. Do we need to take action here?

I can try to change the validation to reject manual triggers with non-hybrid jobs, but I'm not sure how high a priority this is, or how easy it would be to enforce in magic-modules.

melinath commented 2 months ago

Looking over this again, the documentation correctly shows a manual job that uses hybrid_options. There also haven't been many thumbs-up reactions added, so it's probably not in high demand. Client-side validation could help; the other option would be improving the API error message so that it more clearly states the problem and solution (https://google.aip.dev/193).