I've tried both us-central1 and us-west2, with a BQ dataset in each location matching the parameters in my config file:
GOOGLE_CLOUD_REGION=us-west2
GOOGLE_CLOUD_LOCATION=us-west2
Cloud Shell is set up with us-west2.
I still get a failure from the Variant Transforms Docker deployment.
Error message:
ValueError: Failed to load AVRO to BigQuery table mike-kahn-sandbox.personalgenome.23andme__residual
state: DONE
job_id: /projects/mike-kahn-sandbox/jobs/eba036c3-fae5-4b9a-963c-40721be8d3a0
exception: 400 Cannot read and write in different locations: source: US, destination: us-west2.
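For context, the 400 error is a mismatch between BigQuery's US multi-region (where the load job's source data apparently lives) and the regional us-west2 destination dataset; BigQuery treats those as different locations even though us-west2 is inside the US. A minimal sketch of that comparison (`locations_match` is a hypothetical helper for illustration, not part of Variant Transforms):

```python
def locations_match(source: str, destination: str) -> bool:
    """Return True if two BigQuery locations are the same location.

    BigQuery multi-regions ("US", "EU") and regions ("us-west2",
    "us-central1") are distinct locations; a load job fails with
    "Cannot read and write in different locations" when they differ.
    Comparison is case-insensitive ("US" and "us" are the same).
    """
    return source.lower() == destination.lower()


# The failing case from the error message above:
print(locations_match("US", "us-west2"))    # False -> job rejected
# Same location, different case, is fine:
print(locations_match("us-west2", "US-WEST2"))
```

This is why matching GOOGLE_CLOUD_REGION/GOOGLE_CLOUD_LOCATION to the dataset isn't enough if an intermediate source (e.g. a temp bucket or staging dataset) still defaults to the US multi-region.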
I noticed other issues mentioned that us-central1 was the only location that worked, and I tried there as well, but I keep getting failures with the Docker deployment.