aalemanq opened this issue 4 years ago
Is there any possibility to read secrets from environment variables in docker-compose, like the `_FILE` environment variables or something similar?
Thanks for the reply. I tried to apply `env_file` but I get the same issue. When I use environment variables like:
```
LOAD_EX=y
FERNET_KEY=XXXX
EXECUTOR=Celery
AIRFLOW__CELERY__BROKER_URL=pyamqp://airflow:airflow@rabbitmq:5672/airflow
AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@postgres/airflow
```
it works.
But if I apply secrets using this in the `env_file`, it doesn't work :( :
```
AIRFLOW__CORE__FERNET_KEY_CMD=$(cat /run/secrets/fernet_key)
AIRFLOW__CELERY__BROKER_URL_CMD=$(cat /run/secrets/broker_url)
AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=$(cat /run/secrets/sql_alchemy_conn)
AIRFLOW__CELERY__RESULT_BACKEND_CMD=$(cat /run/secrets/result_backend)
```
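A likely cause (my reading of Docker's behavior, not confirmed in this thread): `env_file` values are parsed literally, with no shell involved, so `$(cat ...)` is never expanded and the literal text ends up in the variable. A small simulation of that literal parse:

```shell
#!/bin/sh
# Docker reads env_file lines as plain KEY=VALUE pairs; no shell runs,
# so command substitution in the value is never performed.
line='AIRFLOW__CELERY__BROKER_URL_CMD=$(cat /run/secrets/broker_url)'
value=${line#*=}            # what Docker would place in the environment
printf '%s\n' "$value"      # prints the literal: $(cat /run/secrets/broker_url)
```

That is, the container sees the string `$(cat /run/secrets/broker_url)`, not the file contents.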
Airflow never gets the broker_url, and the RabbitMQ connection is replaced by Redis...
I checked that everything is right and I don't see any errors with the secrets/connection strings:
Some debug:
I deploy the stack and enter the Airflow worker container to check that the secrets exist:
```
airflow@03eb98bd469d:~$ cat /run/secrets/sql_alchemy_conn
postgresql+psycopg2://airflow:airflow@postgres/airflow
airflow@03eb98bd469d:~$ cat /run/secrets/broker_url
pyamqp://airflow:airflow@rabbitmq:5672/airflow
airflow@03eb98bd469d:~$ cat /run/secrets/result_backend
db+postgresql://airflow:airflow@postgres/airflow
```
I set these environment variables by hand:
```
airflow@03eb98bd469d:~$ LOAD_EX=y
airflow@03eb98bd469d:~$ AIRFLOW__CORE__FERNET_KEY_CMD=$(cat /run/secrets/fernet_key)
airflow@03eb98bd469d:~$ EXECUTOR=Celery
airflow@03eb98bd469d:~$ AIRFLOW__CELERY__BROKER_URL_CMD=$(cat /run/secrets/broker_url)
airflow@03eb98bd469d:~$ AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=$(cat /run/secrets/sql_alchemy_conn)
airflow@03eb98bd469d:~$ AIRFLOW__CELERY__RESULT_BACKEND_CMD=$(cat /run/secrets/result_backend)
```
And the echoed values are right:
```
airflow@03eb98bd469d:~$ echo $AIRFLOW__CORE__FERNET_KEY_CMD
46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
airflow@03eb98bd469d:~$ echo $AIRFLOW__CELERY__BROKER_URL_CMD
pyamqp://airflow:airflow@rabbitmq:5672/airflow
airflow@03eb98bd469d:~$ echo $AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD
postgresql+psycopg2://airflow:airflow@postgres/airflow
airflow@03eb98bd469d:~$ echo $AIRFLOW__CELERY__RESULT_BACKEND_CMD
db+postgresql://airflow:airflow@postgres/airflow
```
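One caveat about the manual test above (my observation, not something stated in the thread): assignments made without `export` create shell-local variables only, so a child process such as the `airflow` command would never see them even though `echo` works in the same shell. A quick demonstration:

```shell
#!/bin/sh
# A plain assignment is visible to the current shell but NOT to children.
FOO=hello
sh -c 'printf "%s\n" "${FOO:-unset}"'   # child prints: unset
export FOO                              # now FOO is in the environment
sh -c 'printf "%s\n" "$FOO"'            # child prints: hello
```

So to reproduce what docker-compose's `environment:` does, the variables would need to be exported.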
I don't know what else I can do: secrets tested, environment variables tested. Airflow doesn't run with environment variables plus secrets, but without secrets it works as expected...
In the environment file the commands are not executed.
Take a look at: https://github.com/jupyterhub/jupyterhub-deploy-docker/blob/master/docker-compose.yml#L16
https://github.com/jupyterhub/jupyterhub-deploy-docker/blob/master/Makefile#L20
Yes, that env file doesn't execute anything, but you recommended `env_file` to me, and I thought that maybe with `_CMD` it would work... no.
Really, can nobody pass secrets via environment variables?? This is normal in a lot of software; I can't understand it. I tried and tried and tried and I can't get it to work.
I can't understand this Makefile applied to my Airflow, wittfabian :(. I just want the typical workflow:

```
environment:
  - ENVIRONMENT_FILE=$(cat /run/secrets/file)
```

Can I really not do this in Airflow?!?!
Reading this makes it sound like it is not supported, but it is quite simple. In the docker-compose YAML file, just use something like:
```yaml
environment:
  ...
  AIRFLOW__CORE__FERNET_KEY_CMD: 'cat /run/secrets/fernet_key'
  ...
```
Seems to work fine for me. No `$()` required.
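Putting the pieces of this thread together, a minimal Swarm compose sketch combining external secrets with the `_CMD` variables might look like the following (the service name, image, and secret names are illustrative assumptions, not taken from this thread):

```yaml
version: '3.7'
services:
  webserver:
    image: apache/airflow        # illustrative image name
    environment:
      # No $() here: Airflow itself runs these commands at config time.
      AIRFLOW__CORE__FERNET_KEY_CMD: 'cat /run/secrets/fernet_key'
      AIRFLOW__CELERY__BROKER_URL_CMD: 'cat /run/secrets/broker_url'
      AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD: 'cat /run/secrets/sql_alchemy_conn'
      AIRFLOW__CELERY__RESULT_BACKEND_CMD: 'cat /run/secrets/result_backend'
    secrets:                     # mounts each secret under /run/secrets/
      - fernet_key
      - broker_url
      - sql_alchemy_conn
      - result_backend
secrets:
  fernet_key:
    external: true
  broker_url:
    external: true
  sql_alchemy_conn:
    external: true
  result_backend:
    external: true
```

The key point is that the `_CMD` value is a plain command string; the shell substitution happens inside Airflow, not in docker-compose.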
I have the same issue. I don't have a clue.
Hello, I have spent many weeks trying to configure Airflow using secrets in Docker Swarm.
I tried to use this config in my compose file:
I tried other bash commands to cat this secret and everything fails; the environment variables are never picked up and it uses the Redis default.
I tried to follow the official documentation but...
https://airflow.readthedocs.io/en/stable/howto/set-config.html
> The `_cmd` config options can also be set using a corresponding environment variable the same way the usual config options can. For example:
>
> ```
> export AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=bash_command_to_run
> ```
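Per the docs quoted above, a `_CMD` option holds a shell command whose stdout becomes the actual config value. A rough sketch of that resolution, using `echo` in place of reading a real secret file so it runs anywhere (this is a simplified illustration of the documented behavior, not Airflow's actual code):

```shell
#!/bin/sh
# The *_CMD variable stores a command string; the config value is that
# command's stdout.
AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD='echo postgresql+psycopg2://airflow:airflow@postgres/airflow'
value=$(sh -c "$AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD")
printf '%s\n' "$value"   # prints: postgresql+psycopg2://airflow:airflow@postgres/airflow
```

In a real deployment the command would be `cat /run/secrets/sql_alchemy_conn`, and Airflow would run it itself when reading the config.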
Now I'm trying to run my own entrypoint (a copy of the original entrypoint plus exports of the environment variables), with no luck. Configuring secrets for Airflow in Docker Swarm is a little nightmare for me :(. I want to avoid copying the config file and editing it with sed... I want to use environment variables!
Regards