Closed · eladshabi closed this issue 3 years ago
You are using an old Airflow version with an old version of the operator. It's possible that the issue has been resolved since. Can you please check whether the issue is reproducible with the latest operator version in the Google provider?
@eladkal The issue is hard to reproduce since we need to run a 1-hour BQ query, and the runtime depends on the allocated BQ slots; normally the query takes 20 minutes. I'll try to reproduce it and will update this thread.
Moreover, I've looked for a related bug fix in newer versions, but I didn't find any.
@eladkal I've reproduced the issue with the BigQuery operator by running the SQL loop [1] for 80 minutes.
On version 1.9.0, the task failed with the same error.
On version 1.10.15, the task ran successfully without any errors.
It looks like this bug was fixed in a newer Airflow version.
Thanks!
[1]:
DECLARE x TIMESTAMP;
SET x = CURRENT_TIMESTAMP();
LOOP
  IF TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), x, MINUTE) >= 80 THEN
    LEAVE;
  END IF;
END LOOP;
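For reference, here is a minimal sketch (not taken from the original report) of how such a repro could be wired into a DAG on Airflow 1.10.x using the contrib BigQueryOperator; the DAG id, task id, and schedule below are illustrative assumptions.

# Minimal repro sketch, assuming Airflow 1.10.x and the contrib BigQueryOperator.
# The DAG id and task id are illustrative, not from the original report.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator

# BigQuery script that loops until 80 minutes have elapsed (see [1] above).
LONG_RUNNING_SQL = """
DECLARE x TIMESTAMP;
SET x = CURRENT_TIMESTAMP();
LOOP
  IF TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), x, MINUTE) >= 80 THEN
    LEAVE;
  END IF;
END LOOP;
"""

with DAG(
    dag_id="bq_long_query_repro",   # illustrative name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,         # trigger manually for the repro
    catchup=False,
) as dag:
    BigQueryOperator(
        task_id="run_80_minute_query",
        sql=LONG_RUNNING_SQL,
        use_legacy_sql=False,       # BigQuery scripting requires standard SQL
    )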
Hi,
When using the BigQuery operator on Cloud Composer and the query takes more than 1 hour, we get an "[Errno 32] Broken pipe" error after a 401 error.
The 401 error appeared 1 hour after the task was created, and it looks like the root cause is an expired API token.
It's important to note that the BQ job itself keeps running even though the Airflow task fails.
Please see the relevant log excerpts from the moment the BQ job was triggered (irrelevant information was removed):
Cloud Composer version: composer-1.7.2, Airflow version: 1.9.0.
Thanks