airflow-helm / charts

The User-Community Airflow Helm Chart is the standard way to deploy Apache Airflow on Kubernetes with Helm. Originally created in 2017, it has since helped thousands of companies create production-ready deployments of Airflow on Kubernetes.
https://github.com/airflow-helm/charts/tree/main/charts/airflow
Apache License 2.0

airflow 2.0.1 not able to db init, giving below errors #234

Closed github-sunildhiddi closed 3 years ago

github-sunildhiddi commented 3 years ago

What is your question?

A clear description of your question.

github-sunildhiddi commented 3 years ago

```
(pycdc) airfcdcd@xxxxxxx:/airflow/etl/pyenv/pycdc/bin> airflow db init
DB: mysql+mysqlconnector://afcdcsqld:***@rn000017833:3306/afcdcd
[2021-06-24 00:58:58,333] {db.py:674} INFO - Creating tables
INFO  [alembic.runtime.migration] Context impl MySQLImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade 27c6a30d7c24 -> 86770d1215c0, add kubernetes scheduler uniqueness
Traceback (most recent call last):
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 506, in cmd_query
    raw_as_string=raw_as_string)
_mysql_connector.MySQLInterfaceError: Table 'kube_worker_uuid' already exists

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/mysql/connector/cursor_cext.py", line 266, in execute
    raw_as_string=self._raw_as_string)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 509, in cmd_query
    sqlstate=exc.sqlstate)
mysql.connector.errors.ProgrammingError: 1050 (42S01): Table 'kube_worker_uuid' already exists

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/airflow/etl/pyenv/pycdc/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/cli/commands/db_command.py", line 31, in initdb
    db.initdb()
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/utils/db.py", line 549, in initdb
    upgradedb()
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/utils/db.py", line 684, in upgradedb
    command.upgrade(config, 'heads')
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/command.py", line 294, in upgrade
    script.run_env()
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/script/base.py", line 481, in run_env
    util.load_python_file(self.dir, "env.py")
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/util/pyfiles.py", line 97, in load_python_file
    module = load_module_py(module_id, path)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/util/compat.py", line 182, in load_module_py
    spec.loader.exec_module(module)
  File "", line 728, in exec_module
  File "", line 219, in _call_with_frames_removed
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/migrations/env.py", line 108, in <module>
    run_migrations_online()
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/migrations/env.py", line 102, in run_migrations_online
    context.run_migrations()
  File "", line 8, in run_migrations
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/runtime/environment.py", line 813, in run_migrations
    self.get_context().run_migrations(**kw)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/runtime/migration.py", line 560, in run_migrations
    step.migration_fn(**kw)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/airflow/migrations/versions/86770d1215c0_add_kubernetes_scheduler_uniqueness.py", line 53, in upgrade
    table = op.create_table(RESOURCE_TABLE, *columns_and_constraints)
  File "", line 8, in create_table
  File "", line 3, in create_table
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/operations/ops.py", line 1109, in create_table
    return operations.invoke(op)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/operations/base.py", line 354, in invoke
    return fn(self, operation)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/operations/toimpl.py", line 100, in create_table
    operations.impl.create_table(table)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/ddl/impl.py", line 277, in create_table
    self._exec(schema.CreateTable(table))
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/alembic/ddl/impl.py", line 146, in _exec
    return conn.execute(construct, *multiparams)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/sql/ddl.py", line 72, in _execute_on_connection
    return connection._execute_ddl(self, multiparams, params)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1073, in _execute_ddl
    compiled,
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
    e, statement, parameters, cursor, context
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/mysql/connector/cursor_cext.py", line 266, in execute
    raw_as_string=self._raw_as_string)
  File "/airflow/etl/pyenv/pycdc/lib/python3.7/site-packages/mysql/connector/connection_cext.py", line 509, in cmd_query
    sqlstate=exc.sqlstate)
sqlalchemy.exc.ProgrammingError: (mysql.connector.errors.ProgrammingError) 1050 (42S01): Table 'kube_worker_uuid' already exists
[SQL: CREATE TABLE kube_worker_uuid (
    one_row_id BOOL NOT NULL DEFAULT true,
    worker_uuid VARCHAR(255),
    PRIMARY KEY (one_row_id),
    CONSTRAINT kube_worker_one_row_id CHECK (one_row_id<>0),
    CHECK (one_row_id IN (0, 1))
)]
(Background on this error at: http://sqlalche.me/e/13/f405)
```
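The failing step is the `86770d1215c0` migration, which issues a plain `CREATE TABLE` for `kube_worker_uuid`; that statement aborts when the table survives from an earlier, partially-applied run against the same database. A minimal sqlite sketch of this failure mode (the table name is reused here for illustration only; the real error comes from MySQL via Alembic):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# First run: the migration creates the table successfully.
conn.execute(
    "CREATE TABLE kube_worker_uuid (one_row_id BOOLEAN PRIMARY KEY, worker_uuid VARCHAR(255))"
)

# Second run: a plain CREATE TABLE (with no IF NOT EXISTS guard) aborts,
# because the table is still there from the earlier attempt.
try:
    conn.execute(
        "CREATE TABLE kube_worker_uuid (one_row_id BOOLEAN PRIMARY KEY, worker_uuid VARCHAR(255))"
    )
except sqlite3.OperationalError as exc:
    print(exc)  # table kube_worker_uuid already exists
```

Since applied revisions are tracked in the `alembic_version` table, a database where the table exists but the recorded revision predates `86770d1215c0` is in an inconsistent half-migrated state; rerunning `db init` against it will keep hitting this error.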

thesuperzapper commented 3 years ago

@github-sunildhiddi why are you trying to run `airflow db init`? This chart will automatically run it after you run `helm install`, by deploying a Job resource: https://github.com/airflow-helm/charts/blob/main/charts/airflow/templates/jobs/job-upgrade-db.yaml
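For context, instead of initializing the database by hand, the external MySQL can be declared in the chart's values so that the upgrade-db Job applies the migrations itself. A hedged sketch, using the host/database/user from the log above; the secret name is a placeholder, and the exact key names should be checked against the chart's documented `externalDatabase` values:

```yaml
# values.yaml (sketch -- secret name below is a placeholder)
externalDatabase:
  type: mysql
  host: rn000017833
  port: 3306
  database: afcdcd
  user: afcdcsqld
  passwordSecret: airflow-mysql-password   # pre-created Kubernetes Secret (hypothetical name)
  passwordSecretKey: mysql-password
```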

thesuperzapper commented 3 years ago

@github-sunildhiddi did you resolve this issue? Can it be closed?