Open aviramst opened 7 months ago
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.
Looks like it affects the deferrable mode: CloudStorageTransferServiceCreateJobsTrigger does not have an option to provide a connection id.
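A minimal sketch of the pattern being discussed, assuming a simplified trigger class (the class name, parameters, and serialization shape here are illustrative, not the actual provider API): the operator would forward its `gcp_conn_id` to the trigger, and the trigger would include it in its serialized kwargs so the hook built in the triggerer process uses the configured connection instead of default credentials.

```python
# Hypothetical sketch only -- the real trigger lives in the
# apache-airflow-providers-google package and has more parameters.
class CreateJobsTriggerSketch:
    def __init__(self, project_id: str, job_names: list,
                 gcp_conn_id: str = "google_cloud_default"):
        self.project_id = project_id
        self.job_names = job_names
        # The missing piece in the issue: without this attribute the
        # trigger always falls back to GCP default credentials.
        self.gcp_conn_id = gcp_conn_id

    def serialize(self):
        # Triggers are re-instantiated from this payload in the
        # triggerer process, so the connection id must be part of it.
        return (
            "CreateJobsTriggerSketch",
            {
                "project_id": self.project_id,
                "job_names": self.job_names,
                "gcp_conn_id": self.gcp_conn_id,
            },
        )


# Usage: a deferrable operator would pass its own gcp_conn_id along
# when deferring, e.g.:
trigger = CreateJobsTriggerSketch("my-project", ["job-1"],
                                  gcp_conn_id="my_gcp_conn")
print(trigger.serialize()[1]["gcp_conn_id"])
```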
Hello! I am investigating this issue now and will try to prepare a fix.
Assigned to you @korolkevich :)
The remaining task on this issue is explained in: https://github.com/apache/airflow/pull/37518#discussion_r1541078941
Apache Airflow Provider(s)
google
Versions of Apache Airflow Providers
apache-airflow-providers-google==10.9.0
Apache Airflow version
2.7.2
Operating System
Linux
Deployment
Amazon (AWS) MWAA
Deployment details
No response
What happened
When executing S3ToGCSOperator, it creates the data transfer job successfully but fails to get the job status because it looks for GCP default credentials rather than using the provided `gcp_conn_id`.
What you think should happen instead
I think that S3ToGCSOperator should create the storage API object with the GCP credentials provided via `gcp_conn_id`.
How to reproduce
Anything else
No response
Are you willing to submit PR?
Code of Conduct