TractorZoom / sam-cli-action

Github Action to build, package, and deploy serverless applications using the AWS SAM CLI.

No module named 'samcli' #6

Closed · z0ph closed this issue 4 years ago

z0ph commented 4 years ago

Hello,

Describe the bug

Install aws-sam-cli latest
Successful install aws-sam-cli latest
Run sam build
Traceback (most recent call last):
  File "/bin/sam", line 5, in <module>
    from samcli.cli.main import cli
ModuleNotFoundError: No module named 'samcli'

Is this a known issue?

I saw a very similar approach, with the same issue, using: https://github.com/youyo/aws-sam-action

cody-hoffman commented 4 years ago

@z0ph please provide information on how you came upon this bug, steps to reproduce, etc. We have this sam-cli-action running in many production pipelines at Tractor Zoom with no issues.

z0ph commented 4 years ago

Hello @cody-hoffman

Thanks for the feedback,

Actually, I'm using this GH Actions workflow:

name: "deploy pipeline"

on: 
  push:
    branches:
    - master

jobs:
  aws_sam:
    runs-on: ubuntu-latest
    steps:
      - name: Assume Role
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-west-1
          role-to-assume: xxx-admin
          role-duration-seconds: 1200
          role-session-name: GH-Actions

      - name: Checkout
        uses: actions/checkout@master

      - name: Test credentials - whoami
        run: |
          aws sts get-caller-identity

      - name: sam build
        uses: TractorZoom/sam-cli-action@master
        with:
          sam_command: "build"
cody-hoffman commented 4 years ago

Do you have the env variables set on the sam build step?

- name: sam build
  uses: TractorZoom/sam-cli-action@master
  with: 
    sam_command: build
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    AWS_DEFAULT_REGION: ${{ secrets.REGION }}
z0ph commented 4 years ago

No, I'm using the "Assume Role" step to get AWS credentials for this runner.

I've tried to set env:

Install aws-sam-cli latest
Successful install aws-sam-cli latest
Run sam build
Traceback (most recent call last):
  File "/bin/sam", line 5, in <module>
    from samcli.cli.main import cli
ModuleNotFoundError: No module named 'samcli'

Same error. Are you using the same runs-on: ubuntu-latest?

cody-hoffman commented 4 years ago

Yes we are. I'm not familiar with how the Assume Role step works; as a debug step, have you verified that passing your credentials as env variables also fails in the same way?

cody-hoffman commented 4 years ago

What version of Python are you on? The action currently only supports 3.8, as we primarily develop in Node.

z0ph commented 4 years ago

Yes we are. I'm not familiar with how the Assume Role step works; as a debug step, have you verified that passing your credentials as env variables also fails in the same way?

Yes, same error using:

name: "deploy pipeline"

on: 
  push:
    branches:
    - master

jobs:
  aws_sam:
    runs-on: ubuntu-latest
    steps:
      # - name: Assume Role
      #   uses: aws-actions/configure-aws-credentials@v1
      #   with:
      #     aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
      #     aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      #     aws-region: eu-west-1
      #     role-to-assume: zoph-admin
      #     role-duration-seconds: 1200
      #     role-session-name: GH-Actions

      - name: Checkout
        uses: actions/checkout@master

      # - name: Test credentials - whoami
      #   run: |
      #     aws sts get-caller-identity

      - name: sam build
        uses: TractorZoom/sam-cli-action@master
        with:
          sam_command: "build"
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: ${{ secrets.REGION }}
z0ph commented 4 years ago

What version of Python are you on? The action currently only supports 3.8, as we primarily develop in Node.

Actually, I'm trying to run build without parameters for now; eventually I'll need to set parameters like the build folder, template name, and so on...

My current Lambda function is on python3.7 but could easily move to 3.8; I think the error comes earlier, when the build command runs.
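For what it's worth, moving to 3.8 would only mean changing the runtime in my SAM template, something like this (the logical ID, handler, and code path are just placeholders):

Resources:
  BackupNotifierFunction:           # placeholder logical ID
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler          # placeholder handler
      Runtime: python3.8            # was python3.7
      CodeUri: src/                 # placeholder code path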

cody-hoffman commented 4 years ago

You'll be able to pass any parameters that you would normally pass to build on the command line:

with: 
  sam_command: "build --parameter-overrides MY_PARAM=some-param"
z0ph commented 4 years ago

You'll be able to pass any parameters that you would normally pass to build on the command line:

with: 
  sam_command: "build --parameter-overrides MY_PARAM=some-param"

OK, thanks. Are you sure there is no issue with the pip installation of aws-sam-cli? Are you able to reproduce my issue using my simple GH Actions workflow?
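For reference, here is the kind of sanity check I could run on a plain runner, outside the action's container (just a debug sketch, it doesn't go through the action at all):

- name: Check sam on the runner (debug only)
  run: |
    pip3 install aws-sam-cli
    sam --version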

cody-hoffman commented 4 years ago

I just did, and I am not able to reproduce it; my output is the following:

Install aws-sam-cli latest
Successful install aws-sam-cli latest
Run sam build

Build Succeeded

Built Artifacts  : .aws-sam/build
Built Template   : .aws-sam/build/template.yaml

Commands you can use next
=========================
[*] Invoke Function: sam local invoke
[*] Deploy: sam deploy --guided
z0ph commented 4 years ago

I'm quite new to GH Actions, so I'm sure I'm missing something...

Do you see anything special on my repository? https://github.com/z0ph/s3-backup-notifier

cody-hoffman commented 4 years ago

I'm assuming sam validate works fine and that you have actually set the env variables in your GitHub repo's secrets.

Remind me, do you need to call out which template (since you have two in your root), or does it use template.yml by default?

Your sam template looked fine to me otherwise.
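If it helps, a quick validate step with the same action would look something like this (just a sketch):

- name: sam validate
  uses: TractorZoom/sam-cli-action@master
  with:
    sam_command: "validate"

And if having two templates in the root turns out to be the problem, naming one explicitly via --template on the build/validate command should remove the ambiguity.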

cody-hoffman commented 4 years ago

Also, at the moment we do not use Python extensively at Tractor Zoom, so there's a chance a version clash between 3.7 and 3.8 is causing this, but I'm not a Python expert.

z0ph commented 4 years ago

Hello Cody, following a reorganization of my repository, everything started to work as expected. The error message was really misleading!

Also, I think you could remove the env variables from the build part of your example, because the build phase doesn't use AWS credentials; it's local to the Docker container. Only the deploy phase needs credentials to upload the artifacts and deploy the CloudFormation template. Hope this helps.
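Something like this is what I mean; the stack name is just a placeholder, and --resolve-s3 assumes a recent SAM CLI (otherwise an --s3-bucket would be needed):

- name: sam build
  uses: TractorZoom/sam-cli-action@master
  with:
    sam_command: "build"

- name: sam deploy
  uses: TractorZoom/sam-cli-action@master
  with:
    sam_command: "deploy --stack-name my-stack --capabilities CAPABILITY_IAM --resolve-s3 --no-confirm-changeset"
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    AWS_DEFAULT_REGION: ${{ secrets.REGION }}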

z0ph commented 4 years ago

Thanks a lot for your help! You were really helpful and nice.