serverless / serverless-python-requirements

⚡️🐍📦 Serverless plugin to bundle Python packages
MIT License

GitLab CI : [Errno 2] No such file or directory: '/var/task/requirements.txt #706

Open newza-fullmetal opened 2 years ago

newza-fullmetal commented 2 years ago

I would like to use this plugin in a GitLab CI pipeline. I run my own GitLab Runner with the rights to use Docker, in order to avoid the docker:dind service (the same error occurs on shared runners anyway).

My image is node-alpine, so it is a Linux platform, and the build fails during the dockerized pip step with lambci/lambda:build-python3.8. For an unknown reason I get:

Serverless: Running docker run --rm -v /builds/nanocloud/data-analysis/nanobis/.serverless/requirements\:/var/task\:z lambci/lambda\:build-python3.8 /bin/sh -c 'python3.8 -m pip install -t /var/task/ -r /var/task/requirements.txt && chown -R 0\\:0 /var/task && find /var/task -name \\*.so -exec strip \\{\\} \\;'...
Serverless: Stdout: 
Serverless: Stderr: ERROR: Could not open requirements file: [Errno 2] No such file or directory: '/var/task/requirements.txt'

The requirements file is not found, yet it works perfectly fine locally on macOS (darwin, so non-Linux).
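
A quick way to check is to re-run the plugin's docker step by hand from inside the job container, but just list the mounted directory instead of running pip (same paths as in the log above):

    # If this prints an empty directory, the bind mount is coming up empty
    # inside the container, which is exactly what pip then trips over.
    docker run --rm \
      -v /builds/nanocloud/data-analysis/nanobis/.serverless/requirements:/var/task:z \
      lambci/lambda:build-python3.8 \
      /bin/sh -c 'ls -la /var/task'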

Here is the serverless config part:

 custom:
   pythonRequirements:
     dockerizePip: true # compile non-pure-Python modules
     # zip: true # useless with a layer (already a zip)
     useDownloadCache: false # enabled by default
     useStaticCache: false # enabled by default
     # invalidateCaches: true # only useful if we use our own Python lib
     layer: true # share the requirements between all functions
     slim: true # strip the .so files, remove __pycache__ and dist-info directories as well as .pyc and .pyo files
     slimPatterns: # add our own file patterns to slim
       - '**/tests/'
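
As a side note, to verify what slim and slimPatterns actually removed, you can list the staged requirements after a packaging run; judging by the docker command in the log above, the plugin stages them under .serverless/requirements:

    # Anything matching these patterns should be gone after a slim build:
    find .serverless/requirements \
      -name '__pycache__' -o -name '*.pyc' -o -name '*.pyo' \
      -o -name '*.dist-info' -o -path '*/tests/*'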

The GitLab CI:

image: node:lts-alpine # basic image for serverless
# image: nikolaik/python-nodejs:python3.8-nodejs16-bullseye # needs a real Debian to get the tools required for Python compilation, plus both Python and Node

stages:
  - build
  - deploy

before_script:
  - mkdir -p ~/.aws/
  - echo $AWS_CREDENTIALS | base64 -d > ~/.aws/credentials
  - echo $AWS_CONFIG | base64 -d > ~/.aws/config
  - echo $AWS_CONFIG | base64 -d # debug: print the decoded config to the job log
  - export AWS_SDK_LOAD_CONFIG=1 

cache:
  key: $CI_COMMIT_REF_SLUG
  paths:
    - .npm/
    - node_modules/

publish:
  stage: build
  script:
    - npm ci
  only:
    - master
    - tags

deploy_dev:
  stage: deploy
  variables:
    ENV: dev
  script:
    - apk add --no-cache docker docker-cli
    - docker info
    - AWS_PROFILE=TOTO ./node_modules/.bin/serverless deploy -s $ENV
  only:
    - master

I have been able to deploy successfully by disabling dockerizePip and building locally (with the commented-out image in the CI), but then we lose an important ability of the plugin, because the build environment is not the same as a Lambda's.

Furthermore, I have noticed something odd in the lib's index.js: if we set dockerizePip to 'non-linux' but we actually are on Linux, it deactivates dockerization, while on darwin, for example, it stays active (index.js l.65-67):

    if (options.dockerizePip === 'non-linux') {
      options.dockerizePip = process.platform !== 'linux';
    }
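
To see which branch that check takes on a given image, you can print the platform value Node reports, since that is what the plugin compares against:

    # On a Linux CI image this prints "linux", so dockerizePip: 'non-linux'
    # resolves to false and the dockerized build is skipped; on macOS it
    # prints "darwin" and dockerization stays on.
    node -e "console.log(process.platform)"
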
newza-fullmetal commented 2 years ago

MAYBE A SOLUTION

at least for GitLab Runner

Cause of the problem of missing files with Docker in Docker: when the job runs in a container, the files are on the host machine, so when you try to mount a volume from a local path it does not mean anything, because the files are not really there. Really good explanation here: https://gitlab.com/gitlab-org/gitlab-foss/-/issues/41227#note_52029664
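
You can see the effect with a two-line experiment from inside any job container that talks to the host's daemon (hypothetical /tmp/demo path):

    # This file exists inside the job container...
    mkdir -p /tmp/demo && echo flask > /tmp/demo/requirements.txt
    # ...but the HOST daemon resolves /tmp/demo on the host filesystem,
    # where it is empty or absent, so the new container sees nothing:
    docker run --rm -v /tmp/demo:/data alpine ls -la /data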

I think it is the same for any CI environment using Docker: GitLab, Jenkins, etc.

So here is what is important: while the serverless-python-requirements plugin is running, the files are on the host machine under /builds/YOURPROJECT.

It is this path you need to pass to the Docker container that builds the requirements. After adding a lot of logging to the library, it seems that the way it resolves the path is correct: the function getBindPath in docker.js returns the path on the HOST machine.
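
One way to double-check what path actually reaches the daemon is to inspect the mounts of a container started with the same bind (a sketch; the Source field shows what the daemon received):

    # Start a short-lived container with the same bind, then inspect it:
    cid=$(docker run -d -v "$PWD/.serverless/requirements:/var/task" alpine sleep 30)
    docker inspect --format '{{json .Mounts}}' "$cid"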

So how to make it work:

Using your own GitLab Runner:

Change the config of your runner to share the /builds folder between containers. My config (note privileged = true and the volumes line):

[[runners]]
  name = "my-project"
  url = "https://gitlab.com/"
  executor = "docker"
  [runners.custom_build_dir]
  [runners.cache]
    [runners.cache.s3]
    [runners.cache.gcs]
    [runners.cache.azure]
  [runners.docker]
    tls_verify = false
    image = "docker:19.03.12"
    privileged = true
    disable_entrypoint_overwrite = false
    oom_kill_disable = false
    disable_cache = false
    volumes = ["/var/run/docker.sock:/var/run/docker.sock", "/cache", "/builds:/builds"]
    pull_policy = ["always", "if-not-present"]
    shm_size = 0
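
After restarting the runner with that volumes entry, a quick sanity check from any job is to confirm that a sibling container started through the shared daemon now sees the build directory:

    # $CI_PROJECT_DIR is /builds/<group>/<project>; with /builds:/builds in
    # the runner's volumes, the same path exists on the host too, so the
    # bind mount is no longer empty:
    docker run --rm -v "$CI_PROJECT_DIR:$CI_PROJECT_DIR" alpine ls "$CI_PROJECT_DIR"
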
Using a shared GitLab Runner:

For now I do not know; based on what I found, it should work out of the box. I may retry later to find what is missing.

I hope this helps some people understand this problem, which I have found in a lot of topics but without resolution.

geodude-tech commented 1 year ago

I am running an autoscaling docker-machine GitLab Runner on an EC2 instance. My GitLab CI runs on a python:3.9 image, and one of my jobs runs cdk deploy, which uses Docker to bundle a Python Lambda function. I had to install Docker in the GitLab CI job for the bundling to work, and I kept getting the error: ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'.

After much googling and trying many things, your suggestion to add "/builds:/builds" to the volumes section in the config.toml is exactly what I needed to fix the issue. Thank you very much!

yswtrue commented 1 year ago

(quotes newza-fullmetal's "MAYBE A SOLUTION" comment above in full)

This works for me, but only with gitlab-runner installed via apt, not when the runner itself runs in Docker.