newza-fullmetal opened 2 years ago
Cause of the missing-files problem with Docker in Docker: when the CI job itself runs in a container, the files live on the host machine, so mounting a volume from a local path inside the job container means nothing, because the files are not really there on the host. There is a really good explanation here: https://gitlab.com/gitlab-org/gitlab-foss/-/issues/41227#note_52029664
I think it is the same for any CI environment that uses Docker: GitLab, Jenkins, etc.
So the important point: while the serverless-python-requirements plugin is running, the files are on the host machine under /builds/YOURPROJECT, and it is this path you need to send to the Docker container that zips the requirements. After adding a lot of logging to the library, it seems the way it gets the path is correct: the function getBindPath in docker.js returns the path on the HOST machine.
So how to make it work:
Change the configuration of your runner to share the /builds folder between containers.
My config (note privileged = true and the volumes line):
[[runners]]
  name = "my-project"
  url = "https://gitlab.com/"
  executor = "docker"
  [runners.custom_build_dir]
  [runners.cache]
    [runners.cache.s3]
    [runners.cache.gcs]
    [runners.cache.azure]
  [runners.docker]
    tls_verify = false
    image = "docker:19.03.12"
    privileged = true
    disable_entrypoint_overwrite = false
    oom_kill_disable = false
    disable_cache = false
    volumes = ["/var/run/docker.sock:/var/run/docker.sock", "/cache", "/builds:/builds"]
    pull_policy = ["always", "if-not-present"]
    shm_size = 0
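With that runner config, a job sketch (image, stage, and install commands are illustrative assumptions, not from the original report) could look like this:

```yaml
# .gitlab-ci.yml sketch (names are assumptions)
deploy:
  image: node:18
  script:
    # The runner mounts /var/run/docker.sock, so the docker CLI installed
    # here talks to the HOST daemon; because /builds is shared between
    # host and container, the plugin's bind mount of the project dir
    # under /builds resolves correctly on the host.
    - apt-get update -qq && apt-get install -y -qq docker.io
    - npm ci
    - npx serverless deploy
```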
If you are using a shared GitLab runner: for now I do not know; based on what I found it should work out of the box, but maybe I can retry later to find what is missing.
I hope this helps some people understand this problem, which I have found in a lot of topics but without a resolution.
I am running an autoscaling docker-machine GitLab runner on EC2. My GitLab CI runs on a python:3.9 image, and one of my jobs runs cdk deploy, where it uses Docker to bundle a Python Lambda function. I had to install Docker in the GitLab CI job for the bundling to work, and I kept getting the error: ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'.
After much googling and trying many things, your suggestion to add "/builds:/builds" to the volumes section of config.toml is exactly what I needed to fix the issue. Thank you very much!
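For context, a sketch of what such a CDK job might look like (all names and install steps are assumptions; the CDK CLI itself is assumed to be available or installed separately, since python:3.9 does not ship Node):

```yaml
# .gitlab-ci.yml sketch of the CDK job (names are assumptions).
# With "/builds:/builds" in the runner's config.toml, CDK's Docker
# bundling containers can see the project sources on the host.
cdk-deploy:
  image: python:3.9
  script:
    - apt-get update -qq && apt-get install -y -qq docker.io  # docker CLI for bundling
    - pip install -r requirements.txt
    - cdk deploy --require-approval never
```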
This works for me, but only with gitlab-runner installed via apt, not when running it in Docker.
I would like to use this plugin in a GitLab CI. I run my own GitLab runner with the rights to use Docker, in order to avoid the docker:dind service (I get the same error on shared runners anyway).
My image is node-alpine, so it is a Linux platform, and the build fails during dockerization with lambci/lambda:build-python3.8. For an unknown reason the requirements file is not found, although it works perfectly fine locally on macOS (darwin, so non-linux).
Here is the serverless config part:
And the GitLab CI:
I have been able to deploy successfully by disabling dockerizePip and building locally (with the commented-out image in the CI), but we lose an important ability of the plugin, because the build no longer happens in the same environment as a Lambda.
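The workaround of disabling dockerizePip corresponds to a custom section roughly like this (a sketch; the option values are the ones the plugin documents, everything else is placeholder):

```yaml
# serverless.yml sketch: dockerizePip settings for serverless-python-requirements
custom:
  pythonRequirements:
    # false       -> always build requirements locally (works in CI,
    #                but not in a Lambda-like environment)
    # true        -> always build inside the Lambda build Docker image
    # 'non-linux' -> dockerize only when the build host is NOT Linux
    dockerizePip: false
plugins:
  - serverless-python-requirements
```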
Furthermore, I have noticed something weird in the library's index.js: if we declare 'non-linux' for dockerizePip but we really are on Linux, it deactivates dockerization, while on darwin it stays activated, for example (l.65 - 67).