hashicorp / terraform-provider-google

Terraform Provider for Google Cloud Platform
https://registry.terraform.io/providers/hashicorp/google/latest/docs
Mozilla Public License 2.0

Bug in google_storage_transfer_job & google_storage_transfer_project_service_account #10798

Closed faizan-ahmad-db closed 6 months ago

faizan-ahmad-db commented 2 years ago

Community Note

Terraform Version

Terraform v1.0.11, Google provider version 4.5.0

Affected Resource(s)

google_storage_transfer_job, google_storage_transfer_project_service_account

Terraform Configuration Files

data "google_storage_transfer_project_service_account" "default" {

     project    = “<my_project_ID>”
}

output "default_account" {
  value = data.google_storage_transfer_project_service_account.default.email
}

Expected Behavior

The config should fetch the default storage transfer job SA of my project.

Actual Behavior

Instead, it tries to pick the default storage transfer job SA of the provider project (where our Terraform Cloud is hosted).

This is the same case for google_storage_transfer_job: even though we clearly specified the project ID in the config file, it tries to create the transfer job in the provider project (where TF Cloud is hosted).
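For illustration, a hypothetical minimal sketch of a google_storage_transfer_job block with the project set explicitly; the bucket names and schedule are placeholders, not the reporter's actual config:

resource "google_storage_transfer_job" "example" {
  project     = "<my_project_ID>" # explicitly set, yet the job ends up targeting the provider project
  description = "Example transfer between two buckets"

  transfer_spec {
    gcs_data_source {
      bucket_name = "example-source-bucket"
    }
    gcs_data_sink {
      bucket_name = "example-sink-bucket"
    }
  }

  schedule {
    schedule_start_date {
      year  = 2022
      month = 1
      day   = 1
    }
  }
}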

Getting the below error during terraform plan execution:

Error: Error when reading or editing Google Cloud Storage Transfer service account not found: googleapi: Error 403: Storage Transfer API has not been used in project xxxxxxxxxxx before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/storagetransfer.googleapis.com/overview?project=xxxxxxxxxxx then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry. Details:

[
  {
    "@type": "type.googleapis.com/google.rpc.Help",
    "links": [
      {
        "description": "Google developers console API activation",
        "url": "https://console.developers.google.com/apis/api/storagetransfer.googleapis.com/overview?project=xxxxxxxxxxx"
      }
    ]
  },
  {
    "@type": "type.googleapis.com/google.rpc.ErrorInfo",
    "domain": "googleapis.com",
    "metadata": {
      "consumer": "projects/xxxxxxxxxxx",
      "service": "storagetransfer.googleapis.com"
    },
    "reason": "SERVICE_DISABLED"
  }
]
, accessNotConfigured

b/302673113

unki commented 2 years ago

@faizan-ahmad-db I was hitting this issue too, but apparently it works as designed.

You need to enable the storage-transfer API in both projects.
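For example, a minimal sketch using the google_project_service resource; project-a (the project that owns the service account Terraform authenticates as) and project-b (the project being targeted) are placeholder IDs:

# Enable the Storage Transfer API in the project that holds the service
# account Terraform authenticates as ("project-a" is a placeholder).
resource "google_project_service" "storagetransfer_project_a" {
  project = "project-a"
  service = "storagetransfer.googleapis.com"
}

# ...and in the project where the transfer job or data source is targeted
# ("project-b" is a placeholder).
resource "google_project_service" "storagetransfer_project_b" {
  project = "project-b"
  service = "storagetransfer.googleapis.com"
}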

ambeshsingh commented 2 years ago

@unki regarding your statement to enable the storage-transfer API in both projects: I have created Composer, Dataflow, and many other services without enabling their respective APIs in the project that houses the service account; their APIs are enabled only in the project where those resources are created. I can also run Terraform from my local machine with my user credentials. It is only when I create a transfer service that I face this issue, where it asks me to enable the storage-transfer API in the home project.

ghost commented 1 year ago

Any news on this? This is still happening. Thanks! :)

calum-github commented 9 months ago

Still seeing this on version 5.16 of the google TF provider - this is silly. If you pass the project ID to the data source it should return the service agent FOR THE PROJECT THAT YOU HAVE PASSED IN - no ifs no buts

googlyrahman commented 7 months ago

Hi, we're unable to reproduce this error on our side. I've tried this config on my system:

terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
      version = "4.51.0"
    }
  }
}

data "google_storage_transfer_project_service_account" "default" {
     project = "seventhsky"
}

output "default_account" {
  value = data.google_storage_transfer_project_service_account.default.email
}

and it works fine when running terraform plan. Is there anything I'm missing here?

vgelot commented 7 months ago

Hi, I also have the same issue (I tried with the same input as @googlyrahman, but replaced the project with one of my projects). I tried on my laptop and also in Cloud Shell. Both failed, but with different consumer project numbers.

And looking at those project numbers, they don't belong to us. Maybe those projects belong to Google?

arya-harness commented 6 months ago

Any news on this? I am facing the same issue. I have an entity that creates Terraform resources; it sits in project A and creates resources in project B. I have not seen any other Terraform resource with such behaviour.

SarahFrench commented 6 months ago

@googlyrahman I've been able to reproduce the problem, but it requires some setup. For what it's worth, I believe the problem is a misunderstanding of when and where an API needs to be enabled, and isn't a bug in the provider. I can expand on this, but: using a debugger I've seen that project values provided as arguments in the google_storage_transfer_job resource and google_storage_transfer_project_service_account data source blocks in Terraform config are used by the code and override the provider default project.

Ok, onto the reproduction of the error observed in this issue:

You need two projects: Project A, which has the storagetransfer.googleapis.com API disabled, and Project B, where the storagetransfer.googleapis.com API is enabled (and so the storage transfer service account will exist).

Project A is where the service account exists that Terraform will use as its identity when interacting with Google APIs. Create a service account there and a JSON key file to configure the Google provider with.

Project B is where we'll be either trying to read storagetransfer-related data from, or create resources in.

You can give the service account from Project A some project-level permissions in Project B (e.g. make it project Owner, seeing as this is just a bug reproduction and everything will be deleted after), but from what I've seen the error disrupts the process before permissions become relevant.

terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
      version = "5.30.0"
    }
  }
}

provider "google" {
  credentials = "./path/to/keyfile/for/service-account/in/project-A.json"
  project = "project-C" // this provider default isn't used when doing plan/apply with this config - feel free to check
}

data "google_storage_transfer_project_service_account" "default" {
     project = "project-B"
}

The result is:

Error: Error when reading or editing Google Cloud Storage Transfer service account not found:
googleapi: Error 403: Storage Transfer API has not been used in project <PROJECT NUMBER CORRESPONDING TO PROJECT A> before or it is disabled.
Enable it by visiting https://console.developers.google.com/apis/api/storagetransfer.googleapis.com/overview?project=<PROJECT NUMBER CORRESPONDING TO PROJECT A> then retry.
If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.

Note that the error is reporting that the Storage Transfer API isn't available for use in project A.

When I set TF_LOG=DEBUG as an environment variable I can see that the data source is attempting to read data from the project set in the configuration:

---[ REQUEST ]---------------------------------------
GET /v1/googleServiceAccounts/<PROJECT_B_ID>?alt=json&prettyPrint=false HTTP/1.1
Host: storagetransfer.googleapis.com
User-Agent: google-api-go-client/0.5 Terraform/1.8.0-dev (+https://www.terraform.io) Terraform-Plugin-SDK/2.33.0 terraform-provider-google/dev
X-Goog-Api-Client: gl-go/1.21.3 gdcl/0.177.0
Accept-Encoding: gzip

Given this information, I can see that there isn't a problem with the google_storage_transfer_project_service_account data source attempting to read data from the wrong project. Instead, my mental model is that because the service account identity that Terraform is authenticated as is in project A, any API calls need to be made through APIs that are enabled in project A. I feel this is supported by how I can trigger the same problem using google_bigquery_default_service_account too, as long as the BigQuery API is not enabled in project A, and also by how this Dialogflow documentation describes that both the consumer and resource projects need the Dialogflow API enabled when using service accounts in separate projects to manage Dialogflow resources.
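As a hypothetical illustration of that mental model (not config from this thread), the analogous BigQuery check could look like the following; project-b is a placeholder, and the same 403 would be expected if the BigQuery API is disabled in the project that owns the credentials Terraform authenticates with:

data "google_bigquery_default_service_account" "example" {
  # Targets project-b, but the request is made with credentials from project A,
  # so the BigQuery API needs to be enabled there too.
  project = "project-b"
}

output "bq_default_sa" {
  value = data.google_bigquery_default_service_account.example.email
}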

The problem appears to be that people believe the API only needs to be enabled in the project where storage transfer resources are being provisioned.

@googlyrahman sorry for the looong comment. I see from your profile that you work at Google, and I'm at HashiCorp, so I'm not able to be active on the internal ticket linked to this issue. Could you please comment on how correct my mental model of the problem is, and whether there are any solutions other than ensuring APIs are enabled in the GCP project that contains the service account used by Terraform? Thanks!

googlyrahman commented 6 months ago

That's correct; with the steps mentioned above I'm able to reproduce this error. To summarize the above comment:

We need a minimum of two projects to reproduce this error. Let's call them project A and project B. Use a service account from project A (let's call it proj_a_service_account) that has access to project B. Given the Storage Transfer (STS) API isn't enabled in project A but is enabled in project B, using proj_a_service_account to access transfer-related resources in project B throws the error.

In this case, both projects should have the STS API enabled; if either project does not have it enabled, the error is thrown, so the only solution here is to enable it in both places.

Thanks @SarahFrench for writing such a detailed comment!

SarahFrench commented 6 months ago

Just to add to what we've described above, it could be complicated some more by using user_project_override=true when configuring the provider. I'd need to experiment some more to check as I'm unsure.

SarahFrench commented 6 months ago

> Just to add to what we've described above, it could be complicated some more by using user_project_override=true when configuring the provider. I'd need to experiment some more to check as I'm unsure.

I've not been able to find a different outcome when using user_project_override=true.
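For reference, a minimal sketch (with placeholder values, not taken verbatim from the thread) of a provider block using user_project_override; billing_project is commonly set alongside it to name the project that requests should be attributed to:

provider "google" {
  credentials           = "./path/to/keyfile/for/service-account/in/project-A.json"
  project               = "project-B"
  # With user_project_override set, supported requests are attributed to
  # billing_project (via the X-Goog-User-Project header) instead of the
  # project that owns the credentials.
  user_project_override = true
  billing_project       = "project-B"
}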


I'm closing this GitHub issue because, based on the investigation above, this appears to be expected behaviour around API enablement rather than a bug in the provider.

I recommend that users address this issue by enabling the storagetransfer.googleapis.com API in the project containing the service account they use to authenticate Terraform.

Note: Enabling the Storage Transfer API can be achieved using the google_project_service resource; however, there are a few pitfalls with that resource too. Ensure that the project containing your service account has the Service Usage API enabled, as use of google_project_service depends on that API being enabled (see the sketch below). For further information please see this guide.
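A minimal sketch of that approach, assuming a placeholder project ID project-a for the project that contains the Terraform service account:

resource "google_project_service" "storagetransfer" {
  project = "project-a"
  service = "storagetransfer.googleapis.com"

  # google_project_service itself calls the Service Usage API, so
  # serviceusage.googleapis.com must already be enabled in this project
  # (for example via the Cloud console or gcloud) before this will work.

  # Optional: keep the API enabled if this resource is destroyed, so other
  # consumers in the project aren't disrupted.
  disable_on_destroy = false
}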

github-actions[bot] commented 5 months ago

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.