puckel / docker-airflow

Docker Apache Airflow
Apache License 2.0

libpq error #342

Open agiz opened 5 years ago

agiz commented 5 years ago

I get the following error when building the image myself using docker build --rm --build-arg AIRFLOW_DEPS="datadog,dask" --build-arg PYTHON_DEPS="flask_oauthlib>=0.9" -t agiz/docker-airflow:1.10.2 .:

ImportError: libpq.so.5: cannot open shared object file: No such file or directory
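
A quick way to confirm that the library is actually missing inside the built image (a sketch reusing the tag from the build command above; adjust to your own tag):

    docker run --rm --entrypoint bash agiz/docker-airflow:1.10.2 -c "ldconfig -p | grep libpq"
    # no output means libpq.so.5 is not installed in the image, so psycopg2 fails to import with the error above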

seanjohn7944 commented 5 years ago

I can confirm this error. When building an image with the provided Dockerfile I see the following error when the image is run:

    airflow-worker | [2019-04-08 21:18:19,697] {{settings.py:174}} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1
    airflow-worker | Traceback (most recent call last):
    airflow-worker |   File "/usr/local/bin/airflow", line 21, in <module>
    airflow-worker |     from airflow import configuration
    airflow-worker |   File "/usr/local/lib/python3.7/site-packages/airflow/__init__.py", line 36, in <module>
    airflow-worker |     from airflow import settings, configuration as conf
    airflow-worker |   File "/usr/local/lib/python3.7/site-packages/airflow/settings.py", line 266, in <module>
    airflow-worker |     configure_orm()
    airflow-worker |   File "/usr/local/lib/python3.7/site-packages/airflow/settings.py", line 188, in configure_orm
    airflow-worker |     engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
    airflow-worker |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/__init__.py", line 431, in create_engine
    airflow-worker |     return strategy.create(*args, **kwargs)
    airflow-worker |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/engine/strategies.py", line 87, in create
    airflow-worker |     dbapi = dialect_cls.dbapi(**dbapi_args)
    airflow-worker |   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py", line 599, in dbapi
    airflow-worker |     import psycopg2
    airflow-worker |   File "/usr/local/lib/python3.7/site-packages/psycopg2/__init__.py", line 50, in <module>
    airflow-worker |     from psycopg2._psycopg import (  # noqa
    airflow-worker | ImportError: libpq.so.5: cannot open shared object file: No such file or directory

kajari1verma commented 5 years ago

I am also getting the same error

jjunior84 commented 5 years ago

I've just faced the same problem. I didn't fix it, but I could work around it: I opened the Dockerfile and changed the AIRFLOW_VERSION argument from 1.10.2 to 1.10.1 (ARG AIRFLOW_VERSION=1.10.1).

bgkelly commented 5 years ago

You can also add libpq5 to the build of the underlying image
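
For example, a minimal sketch of an extra layer that only pulls in the PostgreSQL client library (the project's own apt-get step is longer, but the idea is the same):

    RUN apt-get update -yqq \
        && apt-get install -y --no-install-recommends libpq5 \
        && apt-get clean \
        && rm -rf /var/lib/apt/lists/*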

kajari1verma commented 5 years ago

I added libpq5, but I'm still getting the same error (on Linux).

kohonen commented 5 years ago

same problem here

yunus89 commented 5 years ago

Same problem here

kajari1verma commented 5 years ago

Adding the two libraries below resolved the problem on Mac, but I'm still facing the issue on Linux: pip install psycopg2 && pip install psycopg2-binary

mrhorvath commented 5 years ago

I added libpq5 and the issue cleared up. Be mindful not to add it to the buildDeps section, as all of those libraries are removed at the end of the build.
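
Roughly, that placement looks like this (a minimal sketch, assuming the Dockerfile follows the usual pattern of collecting build-only packages in $buildDeps and purging them at the end; the real package lists are longer):

    # libpq5 is listed outside $buildDeps, so the purge at the end keeps it in the final image
    RUN set -ex \
        && buildDeps=' \
            build-essential \
            libpq-dev \
        ' \
        && apt-get update -yqq \
        && apt-get install -y --no-install-recommends $buildDeps libpq5 \
        && pip install psycopg2 \
        && apt-get purge --auto-remove -yqq $buildDeps \
        && apt-get clean \
        && rm -rf /var/lib/apt/lists/*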

agiz commented 5 years ago

I've made a PR #349.

stevenmanton commented 5 years ago

This worked for me: pip install psycopg2-binary redis>=3.2.0, which you can set with PYTHON_DEPS.
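
For example, reusing the docker build invocation from the first post (a sketch; keep whatever other build args and tag you normally use):

    docker build --rm \
        --build-arg PYTHON_DEPS="psycopg2-binary redis>=3.2.0" \
        -t agiz/docker-airflow:1.10.2 .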

paulforan commented 5 years ago

Weird, this issue just started for me today but wasn't there yesterday...

What worked for me was to add 'psycopg2-binary redis>=3.2.0' to the PYTHON_DEPS in the Dockerfile. Didn't have to do that yesterday with the same source code... very strange... clearly something outside my control has changed... wish I knew what... grrr and lol!

thierryturpin commented 5 years ago

I had the same problem, it's related to the psycopg2 release. Created #370 for it.

hbeadles commented 5 years ago

The way I solved it is by removing the postgres reference from the apache-airflow line -- pip install apache-airflow[crypto,celery,hive,jdbc,password]==$AIRFLOW_VERSION -- and just installing psycopg2-binary above it.
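
Roughly, the relevant pip lines in the Dockerfile then look like this (a sketch built from the line quoted above; the extras list in your copy may differ):

    && pip install psycopg2-binary \
    && pip install apache-airflow[crypto,celery,hive,jdbc,password]==$AIRFLOW_VERSION \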

paulforan commented 5 years ago

Thanks all... does anybody know what changed to cause this reconfiguration of the Dockerfile? Just curious...

hbeadles commented 5 years ago

@paulforan I can take a shot at answering. So from version 2.8 forward, psycopg2 no longer bundles binary packages. It bundles those separately in a new package - psycopg2-binary. More information on the change can be found here: http://initd.org/psycopg/articles/2018/02/08/psycopg-274-released/. If you include postgres in that apache-airflow pip install line, it installs the newest version of psycopg2 (2.8.2), which does not have the binary packages included anymore. So to get around that, just remove postgres from there and install psycopg2-binary separately to get those dependencies included. More information can be found here -- https://github.com/puckel/docker-airflow/pull/349
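
Put differently, the two package names behave quite differently from 2.8 onward (a sketch; the version specifiers are only illustrative):

    pip install 'psycopg2>=2.8'      # source-only from 2.8: needs pg_config/libpq-dev and a compiler to build, and libpq5 at runtime
    pip install psycopg2-binary      # pre-built wheel that bundles libpq, so no system library is needed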

thierryturpin commented 5 years ago

Yes, just like @hbeadles is saying. Also in Airflow in your task log you see: "The psycopg2 wheel package will be renamed from release 2.8; in order to keep installing from binary please use "pip install psycopg2-binary" instead. For details see: http://initd.org/psycopg/docs/install.html#binary-install-from-pypi."

paulforan commented 5 years ago

Cool... nice answer, guys... So the moral of the story is to pin dependency versions within the Dockerfile as much as possible to avoid stuff breaking in the future.

I.e. to productionize the container-building process. :-)

okeefj22 commented 5 years ago

Applying the above suggestion of pinning psycopg2-binary==2.8.3 previously fixed this for me, but I'm now seeing it happen again.

thierryturpin commented 5 years ago

Probably another version conflict. In fact on my side I specified all the package versions:

    && pip install -U pip==19.0.1 setuptools==40.7.0 wheel==0.32.3 \
    && pip install pytz==2018.9  \
    && pip install pyOpenSSL==19.0.0 \
    && pip install ndg-httpsclient==0.5.1 \
    && pip install pyasn1==0.4.5 \
    && pip install psycopg2==2.7.7 \

KIRY4 commented 5 years ago

Same issue. Faced it today.


    [2019-07-08 08:29:15,214] {{settings.py:174}} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1
    Traceback (most recent call last):
      File "/usr/local/bin/airflow", line 21, in <module>
        from airflow import configuration
      File "/usr/local/lib/python3.6/site-packages/airflow/__init__.py", line 36, in <module>
        from airflow import settings, configuration as conf
      File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 266, in <module>
        configure_orm()
      File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 188, in configure_orm
        engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
      File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/__init__.py", line 443, in create_engine
        return strategy.create(*args, **kwargs)
      File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/strategies.py", line 87, in create
        dbapi = dialect_cls.dbapi(**dbapi_args)
      File "/usr/local/lib/python3.6/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py", line 599, in dbapi
        import psycopg2
      File "/usr/local/lib/python3.6/site-packages/psycopg2/__init__.py", line 50, in <module>
        from psycopg2._psycopg import (  # noqa
    ImportError: libpq.so.5: cannot open shared object file: No such file or directory

Pinning psycopg2 as above -- && pip install psycopg2==2.7.7 \ -- fixed it for me.
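
For placement, a sketch of how that pin can sit in the Dockerfile's install block (extras list taken from the line quoted earlier in this thread, plus postgres; yours may differ). Installing the pinned psycopg2 first means the postgres extra should find its requirement already satisfied and not pull in 2.8.x:

    && pip install psycopg2==2.7.7 \
    && pip install apache-airflow[crypto,celery,postgres,hive,jdbc,password]==$AIRFLOW_VERSION \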

arunkollan commented 5 years ago

I had a similar issue. I updated the Dockerfile in the airflow directory from FROM puckel/docker-airflow:1.10.2 to FROM puckel/docker-airflow:1.10.3.

About psycopg2, the system messages from airflow worker_1 did complain:

/usr/local/lib/python3.6/site-packages/psycopg2/__init__.py:144: UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in order to keep installing from binary please use "pip install psycopg2-binary" instead. For details see: http://initd.org/psycopg/docs/install.html#binary-install-from-pypi.

Ignoring for now..

mdrijwan123 commented 5 years ago

Yes, @hbeadles's suggestion above (remove postgres from the apache-airflow extras and install psycopg2-binary separately, see #349) worked in my case. Thanks!

imsrgadich commented 5 years ago

If someone is using kube-airflow: use imageTag: latest in airflow/values.yaml. That solved the issue for me.