Closed MISSEY closed 2 years ago
Thanks for opening your first issue here! Be sure to follow the issue template!
Something’s wrong with your deployment. The TaskFail model removed the execution_date column
in 2.3, but something is still trying to write to that column. Airflow itself does not do this (in 2.3), so it must be something else.
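One way to confirm the schema mismatch is to check whether the old column still exists in the metadata DB. A minimal sketch, assuming a Postgres metadata database (the connection string is a placeholder):

```shell
# Check whether task_fail still has the pre-2.3 execution_date column.
# On a correctly migrated 2.3 schema this query returns zero rows.
psql "$AIRFLOW_DB_URL" -c \
  "SELECT column_name FROM information_schema.columns
   WHERE table_name = 'task_fail' AND column_name = 'execution_date';"
```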
Looks like your database is from an old Airflow version. I believe your migration job did not run. Likely you installed or upgraded Airflow in a way that prevented the migration from happening, or you killed the job or something. You need to make sure the migration job runs, which usually happens during installation. You can look up several issues here where, in some K8S deployments, using `--wait` caused problems, because `--wait` does not allow the migration job to run. You should likely reinstall Airflow from scratch. This should be no problem as long as your database is remote: you can nuke and reinstall Airflow. Then you can use the `airflow db` commands to verify that migrations have been applied correctly — see the migrations reference (https://airflow.apache.org/docs/apache-airflow/stable/migrations-ref.html) — and you can confirm they are applied by looking at the alembic/DB migration status.
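For reference, a sketch of checking migration status from inside a pod that has Airflow and its DB connection configured (timeouts and pod context are up to you):

```shell
# Verify the metadata DB is reachable and fully migrated.
airflow db check                                        # connectivity only
airflow db check-migrations --migration-wait-timeout 60 # waits until all migrations are applied

# Apply any missing migrations manually (the same thing the chart's migration job does):
airflow db upgrade

# Inspect the alembic revision recorded in the DB and compare it against the
# migrations reference linked above:
airflow db shell   # then: SELECT version_num FROM alembic_version;
```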
However, if you somehow messed with your DB and modified it manually (which is likely), or if you broke migrations partway through at some point, the only way to recover is to restore the DB to a point before it was broken, or possibly recreate the DB from scratch.
Also, seeing the whole of your configuration is not helpful at all; it is hard to say what was modified, how, and when. You are the only person who knows which changes were applied in what sequence.
Also, some other customisations (plugins, settings, etc.) might be causing the problem. If you have applied modifications or added any custom code, I suggest a "bisect" approach: start from the plain Airflow chart and image, with no code of yours added except simple DAGs, verify that it works, and then add your customisations one by one. We are pretty sure stock Airflow does not have the problem you have, so some customisation of yours likely caused it.
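A sketch of the bisect starting point, assuming the official chart (release and namespace names are placeholders):

```shell
# Start from the stock chart and image with default values, no custom code.
helm repo add apache-airflow https://airflow.apache.org
helm repo update
# Deliberately no --wait: the DB migration job must be allowed to run to completion.
helm upgrade --install airflow apache-airflow/airflow \
  --namespace airflow --create-namespace
# Confirm the migration job completed before re-adding customisations one by one.
kubectl -n airflow get jobs
```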
Converting this into a discussion until you do some experimenting with the above.
Apache Airflow version
Other Airflow 2 version
What happened
I installed Airflow using Helm chart 1.6.0 and changed the default Airflow image in the values to 2.3.0-python3.8. When the DAGs failed, the error inside the logs is
What you think should happen instead
It should show me the error that happened inside the DAGs. Since the column named 'execution_date' is missing, the error couldn't be dumped into the
task_fail
table.

How to reproduce
Custom Helm Values
Operating System
Using Kubernetes operator
Versions of Apache Airflow Providers
Airflow image : 2.3.0-python3.8
Deployment
Official Apache Airflow Helm Chart
Deployment details
Chart Version
1.6.0
Kubernetes Version
Helm Version
Anything else
No response
Are you willing to submit PR?
Code of Conduct