puckel / docker-airflow

Docker Apache Airflow

airflow CLI fails with no such table error when using docker-compose-CeleryExecutor.yml #583

Open yohei1126 opened 4 years ago

yohei1126 commented 4 years ago

Overview

Steps to reproduce the error

docker-compose -f docker-compose-CeleryExecutor.yml up
> docker ps
CONTAINER ID        IMAGE                          COMMAND                  CREATED             STATUS                             PORTS                                        NAMES
0646cff9283e        puckel/docker-airflow:1.10.9   "/entrypoint.sh work…"   23 seconds ago      Up 23 seconds                      5555/tcp, 8080/tcp, 8793/tcp                 docker-airflow_worker_1
d1469456a45a        puckel/docker-airflow:1.10.9   "/entrypoint.sh sche…"   24 seconds ago      Up 23 seconds                      5555/tcp, 8080/tcp, 8793/tcp                 docker-airflow_scheduler_1
6612c673485e        puckel/docker-airflow:1.10.9   "/entrypoint.sh webs…"   24 seconds ago      Up 24 seconds (health: starting)   5555/tcp, 8793/tcp, 0.0.0.0:8080->8080/tcp   docker-airflow_webserver_1
cffc97f915e3        puckel/docker-airflow:1.10.9   "/entrypoint.sh flow…"   24 seconds ago      Up 24 seconds                      8080/tcp, 0.0.0.0:5555->5555/tcp, 8793/tcp   docker-airflow_flower_1
24dde1b28a66        postgres:9.6                   "docker-entrypoint.s…"   25 seconds ago      Up 24 seconds                      5432/tcp                                     docker-airflow_postgres_1
3cd7cf974526        redis:5.0.5                    "docker-entrypoint.s…"   25 seconds ago      Up 24 seconds                      6379/tcp                                     docker-airflow_redis_1

> docker exec -it 6612c673485e bash
$ airflow connections -l
[2020-08-15 04:34:35,822] {{cli_action_loggers.py:107}} WARNING - Failed to log action with (sqlite3.OperationalError) no such table: log
[SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (?, ?, ?, ?, ?, ?, ?)]
[parameters: ('2020-08-15 04:34:35.819743', None, None, 'cli_connections', None, 'airflow', '{"host_name": "6612c673485e", "full_command": "[\'/usr/local/bin/airflow\', \'connections\', \'-l\']"}')]
(Background on this error at: http://sqlalche.me/e/e3q8)
Traceback (most recent call last):
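For reference, the same exec'd shell shows which metadata DB the CLI is resolving to (a rough check; the airflow.cfg path assumes this image's default AIRFLOW_HOME of /usr/local/airflow):

$ env | grep -i sql_alchemy_conn                                # check whether AIRFLOW__CORE__SQL_ALCHEMY_CONN is set in this shell (it appears not to be, given the sqlite error above)
$ grep -i '^sql_alchemy_conn' /usr/local/airflow/airflow.cfg    # what the CLI falls back to when the override is missing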

muthu1086 commented 3 years ago

I ran into the same problem with the Celery setup. It happens because the CLI ends up checking SQLite rather than Postgres.

I'm just starting out with both Docker and Airflow and figuring things out the hard way.

If you look at the scheduler and worker, they also complain that the tables are not available.
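One workaround for the exec'd-shell case seems to be pointing the CLI at Postgres explicitly via the standard Airflow environment override (the connection string below assumes the compose defaults of airflow/airflow on the postgres service; adjust if yours differ):

$ export AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
$ airflow connections -l    # now runs against the Postgres metadata DB instead of sqlite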

One thing not mentioned here is that you need to uncomment the Postgres and Redis entries in the docker-compose-CeleryExecutor.yml file, so that the services are actually told to use Postgres. After modifying that you can run

docker-compose -f docker-compose-CeleryExecutor.yml up -d

which recreates the containers with the updated configuration.
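To sanity-check that the change took effect, something along these lines should do (the variable names, e.g. POSTGRES_USER / POSTGRES_PASSWORD / POSTGRES_DB and REDIS_PASSWORD, are whichever entries are commented out in your copy of the file):

> docker-compose -f docker-compose-CeleryExecutor.yml config | grep -iE 'postgres|redis'    # uncommented entries appear in the rendered config
> docker-compose -f docker-compose-CeleryExecutor.yml up -d                                 # recreates the affected services
> docker exec docker-airflow_webserver_1 env | grep -iE 'postgres_|redis_'                  # and they are visible inside the container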

Thanks