cdrx / docker-pyinstaller

PyInstaller for Linux and Windows inside Docker
MIT License

Linux image for Python3 not running on GitLab CI Runner #38

Closed b2m closed 4 years ago

b2m commented 5 years ago

Environment

GitLab CI Runner in Docker mode with bash shell and current cdrx/pyinstaller-linux:python3 image.

Problem description:

Some CI servers, like the current GitLab CI Runners, execute scripts in Docker containers in a non-interactive shell. In this mode the .bashrc is not executed, which is discussed in detail in this StackExchange thread. The changes the current Dockerfile for the linux python3 image adds to .bashrc are therefore ignored. This results in errors like "pip: command not found", because the changes to $PATH never take effect.
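As a quick illustration (a sketch, not code from this repo): bash reports its mode in the $- flags, and only interactive shells source ~/.bashrc automatically, which is why PATH changes made there vanish in a CI runner:

```shell
# Sketch: bash includes "i" in $- only for interactive shells; a CI runner
# invokes bash non-interactively, so ~/.bashrc (and its PATH edits) is skipped.
flags="$(bash -c 'echo $-')"
case "$flags" in
  *i*) echo "interactive: ~/.bashrc would be sourced" ;;
  *)   echo "non-interactive: ~/.bashrc is skipped" ;;
esac
# -> non-interactive: ~/.bashrc is skipped
```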

Scope of the problem:

I did not explicitly check how other CI solutions currently behave regarding interactive/non-interactive shell code execution.

GabLeRoux commented 5 years ago

Hey there, I didn't try with custom requirements.txt yet, but here's how I managed to get gitlab-ci working:

https://gitlab.com/gableroux/python-windows-application-example

:v:

b2m commented 5 years ago

I really like the idea of using docker in docker to avoid the problems caused by the way the GitLab Runner "pipes" the script commands to the image.

Unfortunately this also means that the PyInstaller Docker image is not cached and has to be downloaded for every build, which unnecessarily increases network traffic and build time. This could be mitigated by caching, but in my opinion that is more effort than should be necessary.

I guess you noticed the sh: 1: set: Illegal option -o pipefail error message in your first try?

As far as I have investigated this problem, the reason is that GitLab CI runs some commands of its own before piping in the user-provided script part(s). These commands are then executed via sh -c "$@" in the entrypoint.sh, and sh defaults to a "lesser" shell, which fails on bash-specific commands.

Again, you can work around this by overriding the entrypoint of the Docker image in the GitLab CI configuration file and providing your own script, but it would be more convenient for users if the entrypoint.sh used bash instead of sh to execute the user-provided commands.
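A minimal sketch of such an adjusted entrypoint.sh (hypothetical, not the repo's actual file; the /root/.bashrc path is assumed from the image's pyenv setup):

```shell
#!/bin/bash
# Hypothetical entrypoint.sh variant: run the user-provided command string
# through bash instead of sh, so bash-only options such as
# `set -o pipefail` keep working.
set -e
export PS1='$ '                  # non-empty PS1 so .bashrc's guard passes
if [ -f /root/.bashrc ]; then
    source /root/.bashrc         # pick up the pyenv changes to $PATH
fi
if [ "$#" -gt 0 ]; then
    exec /bin/bash -c "$@"       # hand the CI script to a real bash
fi
```

The key line is the last one: sh -c "$@" becomes /bin/bash -c "$@", which is enough to make `set -o pipefail` valid in the piped-in script.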

Skinner927 commented 5 years ago

I wanted to share my workaround for this issue (I haven't run this in GitLab CI, but I believe the issues are similar).

docker run -v "$(pwd):/src/" -v pyinstaller-pyenv:/root/.pyenv --rm --entrypoint /bin/bash cdrx/pyinstaller-linux -c '\
  export PS1="$$" && \
  source ~/.bashrc && \
  PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install -s 3.6.7 && \
  pyenv rehash && \
  pyenv shell 3.6.7 && \
  python --version && \
  pip install --upgrade pip && \
  pip install -r requirements.txt && \
  pyinstaller --clean -y --dist ./dist/linux --workpath /tmp package.spec'

Some notable things are happening:

  1. -v pyinstaller-pyenv:/root/.pyenv creates a Docker volume to store the pyenv environment, so Python does not have to be installed every time the container runs. You may want to change the volume name (pyinstaller-pyenv, before the :) to the actual name of your project (e.g. -v foobar-pyenv:/root/.pyenv). It's best to keep pyenvs isolated per project to prevent race conditions and dependency conflicts (since we don't use a virtualenv). An alternative is to manually mount a shared volume, install all the Python versions you'll need, and then mount that volume read-only in the runners. (That would require all dependencies to be installed on every build, but you could mount another volume for the pip cache dir; I'm already way out of the scope of this issue.)

  2. export PS1="$$" gives us a non-empty shell prompt variable, which then allows source ~/.bashrc to run properly: the first line of .bashrc checks that $PS1 is non-empty.
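That guard can be illustrated with a tiny self-contained demo (the file name and marker variable are made up for illustration; the real .bashrc in the image differs):

```shell
# Demonstrates the usual .bashrc interactivity guard: with PS1 unset the
# file returns early; with PS1 set, the rest of the file runs.
cat > /tmp/demo_bashrc <<'EOF'
[ -z "$PS1" ] && return      # typical guard at the top of a .bashrc
export DEMO_MARKER=reached
EOF

unset DEMO_MARKER PS1
. /tmp/demo_bashrc
echo "without PS1: ${DEMO_MARKER:-skipped}"   # -> without PS1: skipped

export PS1='$ '
. /tmp/demo_bashrc
echo "with PS1: ${DEMO_MARKER:-skipped}"      # -> with PS1: reached
```

This is why the workaround exports PS1 before sourcing ~/.bashrc: without it, the pyenv setup further down in the file is never reached.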