kestra-io / plugin-gcp


BigQuery StorageWrite task gives authentication error #346

Open shrutimantri opened 6 months ago

Expected Behavior

BigQuery StorageWrite task should run successfully.

Actual Behavior

BigQuery StorageWrite task gives an authentication error. Here is the error stack trace:

2024-03-23 11:27:38.790 • Using service account: smantri-kestra-20230317@<project-id>.iam.gserviceaccount.com
2024-03-23 11:27:38.794 • Using service account: smantri-kestra-20230317@<project-id>.iam.gserviceaccount.com
2024-03-23 11:27:39.739 Your default credentials were not found. To set up Application Default Credentials for your environment, see https://cloud.google.com/docs/authentication/external/set-up-adc.
2024-03-23 11:27:39.739 java.io.IOException: Your default credentials were not found. To set up Application Default Credentials for your environment, see https://cloud.google.com/docs/authentication/external/set-up-adc.
    at com.google.auth.oauth2.DefaultCredentialsProvider.getDefaultCredentials(DefaultCredentialsProvider.java:127)
    at com.google.auth.oauth2.GoogleCredentials.getApplicationDefault(GoogleCredentials.java:152)
    at com.google.auth.oauth2.GoogleCredentials.getApplicationDefault(GoogleCredentials.java:124)
    at com.google.api.gax.core.GoogleCredentialsProvider.getCredentials(GoogleCredentialsProvider.java:70)
    at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:162)
    at com.google.cloud.bigquery.storage.v1.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:132)
    at com.google.cloud.bigquery.storage.v1.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:147)
    at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.<init>(BigQueryWriteClient.java:259)
    at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.create(BigQueryWriteClient.java:241)
    at com.google.cloud.bigquery.storage.v1.ConnectionWorker.<init>(ConnectionWorker.java:330)
    at com.google.cloud.bigquery.storage.v1.StreamWriter.<init>(StreamWriter.java:235)
    at com.google.cloud.bigquery.storage.v1.StreamWriter.<init>(StreamWriter.java:57)
    at com.google.cloud.bigquery.storage.v1.StreamWriter$Builder.build(StreamWriter.java:823)
    at com.google.cloud.bigquery.storage.v1.SchemaAwareStreamWriter.<init>(SchemaAwareStreamWriter.java:105)
    at com.google.cloud.bigquery.storage.v1.SchemaAwareStreamWriter.<init>(SchemaAwareStreamWriter.java:56)
    at com.google.cloud.bigquery.storage.v1.SchemaAwareStreamWriter$Builder.build(SchemaAwareStreamWriter.java:660)
    at com.google.cloud.bigquery.storage.v1.JsonStreamWriter.<init>(JsonStreamWriter.java:50)
    at com.google.cloud.bigquery.storage.v1.JsonStreamWriter.<init>(JsonStreamWriter.java:38)
    at com.google.cloud.bigquery.storage.v1.JsonStreamWriter$Builder.build(JsonStreamWriter.java:410)
    at io.kestra.plugin.gcp.bigquery.StorageWrite.run(StorageWrite.java:128)
    at io.kestra.plugin.gcp.bigquery.StorageWrite.run(StorageWrite.java:47)
    at io.kestra.core.runners.Worker$WorkerThread.run(Worker.java:710)

Other BigQuery tasks are working as expected. I have also checked that my GCP service account has access to the BigQuery Storage API (screenshot attached).
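
Looking at the stack trace, the failure happens inside StorageWrite.run (StorageWrite.java:128), where the JsonStreamWriter ends up creating a BigQueryWriteClient that falls back to Application Default Credentials instead of using the serviceAccount configured on the task. As a rough illustration only (this is my own sketch, not the plugin's actual code; the class name, the GCP_SERVICE_ACCOUNT_JSON environment variable and the table coordinates are placeholders taken from my flow), the Storage Write API client can be built with explicit credentials and handed to the JsonStreamWriter builder:

import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.storage.v1.BigQueryWriteClient;
import com.google.cloud.bigquery.storage.v1.BigQueryWriteSettings;
import com.google.cloud.bigquery.storage.v1.JsonStreamWriter;
import com.google.cloud.bigquery.storage.v1.TableName;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class StorageWriteCredentialsSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical input: the same JSON key that the task's serviceAccount property receives.
        String serviceAccountJson = System.getenv("GCP_SERVICE_ACCOUNT_JSON");

        // Build credentials explicitly from the key instead of relying on the ADC lookup.
        GoogleCredentials credentials = GoogleCredentials
            .fromStream(new ByteArrayInputStream(serviceAccountJson.getBytes(StandardCharsets.UTF_8)))
            .createScoped(List.of("https://www.googleapis.com/auth/bigquery"));

        BigQueryWriteSettings settings = BigQueryWriteSettings.newBuilder()
            .setCredentialsProvider(FixedCredentialsProvider.create(credentials))
            .build();

        try (BigQueryWriteClient writeClient = BigQueryWriteClient.create(settings)) {
            TableName table = TableName.of("<project-id>", "smantri_dataset", "orders_new_1");
            // Passing the pre-configured client means its settings (and credentials) should be
            // reused for the stream connections, so no fallback to default credentials is needed.
            try (JsonStreamWriter writer =
                    JsonStreamWriter.newBuilder(table.toString(), writeClient).build()) {
                // writer.append(...) with JSON rows would go here (table's default write stream).
            }
        }
    }
}

With something along these lines, the write connections should pick up the fixed credentials rather than running the ADC lookup shown in the stack trace. Until then, it looks like the StorageWrite task only works when Application Default Credentials are available on the worker (for example via GOOGLE_APPLICATION_CREDENTIALS), regardless of what serviceAccount is set to.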

Steps To Reproduce

  1. Take the sample flow for the BigQuery StorageWrite task.
  2. Put in the appropriate values.
  3. Execute the flow.

Environment Information

Example flow

id: bq-storage-write
namespace: dev
tasks:
  - id: http_download
    type: io.kestra.plugin.fs.http.Download
    uri: https://raw.githubusercontent.com/kestra-io/datasets/main/csv/orders.csv
  - id: "storage_write"
    type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
    from: "{{ outputs.http_download.uri }}"
    projectId: <project-id>
    serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT_JSON') }}"
    destinationTable: "<project-id>.smantri_dataset.orders_new_1"
    writeStreamType: DEFAULT