The root cause should be https://github.com/apache/airflow/blob/749e53def43055225a2e5d09596af7821d91b4ac/airflow/cli/commands/task_command.py#L106

It is very strange: in the `airflow tasks run --local` process the dag_run can be correctly loaded, but in the `airflow tasks run --raw` process it returns None. The session was not configured correctly, which is why it always complains about DagRunNotFound.
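For context, a minimal sketch of the kind of lookup that fails; the exact query lives at the line linked above, and the dag_id/run_id values here are made up for illustration (this only runs with Airflow installed and configured):

```python
from airflow.models import DagRun
from airflow.utils.session import create_session

with create_session() as session:
    dag_run = (
        session.query(DagRun)
        .filter(DagRun.dag_id == "example_dag",
                DagRun.run_id == "manual__2022-05-12T00:00:00+00:00")
        .one_or_none()
    )
    # In the --local process this finds the row; in the --raw process the
    # misconfigured session points at the wrong database, the query returns
    # None, and the task command raises DagRunNotFound.
    print(dag_run)
```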
Looks like the bug was introduced in this PR: https://github.com/apache/airflow/pull/22284

My airflow.cfg does not have a `[database]` section, but I have `sql_alchemy_conn` under `[core]`. So when the StandardTaskRunner creates the tmp cfg file, `sql_alchemy_conn` ends up in both `[database]` and `[core]`. The value of `sql_alchemy_conn` in `[database]` is sqlite, while `sql_alchemy_conn` in `[core]` has my intended value (mysql). `sql_alchemy_conn` is then read from `[database]`, which is why it is sqlite instead of mysql. AAARGH.
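To make the precedence concrete, here is a minimal sketch (plain configparser, not Airflow's actual config code) of a tmp cfg that ends up with the option duplicated across both sections; the connection strings are placeholders:

```python
from configparser import ConfigParser

# Hypothetical contents of the tmp cfg written for the raw task process:
# [database] picks up the sqlite default, [core] keeps my configured value.
TMP_CFG = """
[database]
sql_alchemy_conn = sqlite:////tmp/airflow.db

[core]
sql_alchemy_conn = mysql://airflow:PASSWORD@localhost/airflow
"""

parser = ConfigParser()
parser.read_string(TMP_CFG)

# If the renamed option is resolved from [database] first, the sqlite
# default shadows the mysql value from [core].
print(parser.get("database", "sql_alchemy_conn"))  # sqlite:////tmp/airflow.db
print(parser.get("core", "sql_alchemy_conn"))      # mysql://...
```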
Good diagnosis @pingzh. Confirmed!
thanks @potiuk for quickly getting this fixed 👍
Apache Airflow version
main (development)
What happened
Trying to run the `airflow tasks run` command locally and force StandardTaskRunner to use `_start_by_exec` instead of `_start_by_fork` (see the sketch below).
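For reference, a rough sketch of the fork-vs-exec decision as I understand it (paraphrased, not StandardTaskRunner's actual code: fork when os.fork exists and no run_as_user impersonation is requested, otherwise exec `airflow tasks run --raw` with a tmp cfg file):

```python
import os

def choose_start_method(run_as_user=None):
    # Fork when the platform supports it and no impersonation is requested;
    # otherwise fall back to exec-ing a raw subprocess.
    can_fork = hasattr(os, "fork")
    if can_fork and not run_as_user:
        return "_start_by_fork"
    return "_start_by_exec"

print(choose_start_method())           # _start_by_fork on macOS/Linux
print(choose_start_method("airflow"))  # _start_by_exec
```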
However, it always errors out:
see https://user-images.githubusercontent.com/8662365/168164336-a75bfac8-cb59-43a9-b9f3-0c345c5da79f.png
I have checked that the dag_run does exist in my DB.
What you think should happen instead
No response
How to reproduce
Pull the latest main branch with this commit:
7277122ae62305de19ceef33607f09cf030a3cd4
Run airflow scheduler, webserver and worker locally with CeleryExecutor.

Operating System
Apple M1 Max, version: 12.2
Versions of Apache Airflow Providers
NA
Deployment
Other
Deployment details
On my local Mac with the latest main branch, latest commit:
7277122ae62305de19ceef33607f09cf030a3cd4
Anything else
Python version: 3.9.7
Are you willing to submit PR?
Code of Conduct