koslib / helm-eks-action

The simplest GitHub Action for executing Helm commands on EKS, cluster authentication included
MIT License

unknown flag: --values #1

Closed mingujotemp closed 4 years ago

mingujotemp commented 4 years ago

Hey pal, thanks for your work!

Our team is trying to use your helm eks action, and we're seeing some issues that are hard to debug.

We're trying to manage one of the stable charts (stable/airflow), but it complains that we need to run helm repo update first.

So the command in our helm deploy step is helm repo update && helm upgrade airflow-staging stable/airflow --namespace=staging --values staging-values.yaml

However, somehow helm doesn't understand the flag, even though we're on the right version.

No matter which flags we provide (--install, --wait, ...), helm doesn't understand them.

Do you see why? (screenshot attached)

koslib commented 4 years ago

@mingujotemp thanks for bringing this up!

  1. Can you try running this again? I see that you use the master branch version directly, and I pushed a change a while ago, so just rerun your jobs and let me know how it goes.
  2. Could you copy-paste the exact error log here? It will help me understand what happens, and where exactly, so I can try to help.

Thanks!

mingujotemp commented 4 years ago

Hey @koslibpro! It seems like it's still happening.

Here's the log from the GitHub Actions UI:

/usr/bin/docker run --name c201f506b6cf121046eba8ed95e10619543c_7e66e2 --label 87c201 --workdir /github/workspace --rm -e AWS_DEFAULT_REGION -e AWS_REGION -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e KUBE_CONFIG_DATA -e INPUT_COMMAND -e HOME -e GITHUB_JOB -e GITHUB_REF -e GITHUB_SHA -e GITHUB_REPOSITORY -e GITHUB_REPOSITORY_OWNER -e GITHUB_RUN_ID -e GITHUB_RUN_NUMBER -e GITHUB_ACTOR -e GITHUB_WORKFLOW -e GITHUB_HEAD_REF -e GITHUB_BASE_REF -e GITHUB_EVENT_NAME -e GITHUB_SERVER_URL -e GITHUB_API_URL -e GITHUB_GRAPHQL_URL -e GITHUB_WORKSPACE -e GITHUB_ACTION -e GITHUB_EVENT_PATH -e RUNNER_OS -e RUNNER_TOOL_CACHE -e RUNNER_TEMP -e RUNNER_WORKSPACE -e ACTIONS_RUNTIME_URL -e ACTIONS_RUNTIME_TOKEN -e ACTIONS_CACHE_URL -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/class101-airflow/class101-airflow":"/github/workspace" 87c201:f506b6cf121046eba8ed95e10619543c  "helm repo update ; helm upgrade airflow-staging stable/airflow --install --version \"7.1.5\" -n staging --set airflow.config.AIRFLOW__KUBERNETES__GIT_BRANCH=abe/helm_github_action --set dags.git.ref=abe/helm_github_action --values staging-values.yaml"
Installed plugin: secrets
Error: unknown flag: --install
mingujotemp commented 4 years ago

My github workflow yaml looks like below:

name: Deploy

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      - name: Jberlinsky AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-northeast-2

      - name: Set output
        id: vars
        run: echo ::set-output name=short_ref::${GITHUB_REF#refs/*/}

      - name: helm deploy
        uses: koslibpro/helm-eks-action@master
        env:
          KUBE_CONFIG_DATA: ${{ secrets.KUBECONFIG }}
        with:
          command: helm repo update ; helm upgrade airflow-staging stable/airflow --install --version "7.1.5" -n staging --set airflow.config.AIRFLOW__KUBERNETES__GIT_BRANCH=abe/helm_github_action --set dags.git.ref=abe/helm_github_action --values staging-values.yaml

You can ignore the Set output section.

koslib commented 4 years ago

@mingujotemp Hmm, that is indeed very weird. Just a heads up: the helm version provided in the GitHub Action is helm3. Could that be the root cause? Do you have any prior experience with this?

I'll also try to run the exact same command on a test environment.

Update: I executed this helm command on a test Kubernetes env (on Docker for Mac), and it worked as expected. I'd say this issue doesn't look specific to this GitHub Action; it seems more like a Helm problem.

mingujotemp commented 4 years ago

hey @koslibpro, again, thanks for looking into this. We do use helm3.

I've looked around a bit more, and it looks like a special-character escaping issue in the subshell in entrypoint.sh. It seems result="$($1)" refuses to run multiple commands.

Can you at least add helm repo add stable https://kubernetes-charts.storage.googleapis.com/ in there and push the change, so that I can avoid putting multiple commands in your subshell? Actually, let me make a PR for you.

That might solve our issue. I also think many people would want access to the stable charts.
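For what it's worth, here's a minimal sketch of why result="$($1)" breaks compound commands (the function names run_broken/run_fixed are made up for illustration, not from entrypoint.sh): when the input string is expanded unquoted, the shell word-splits it and hands ; and everything after it to the first program as literal arguments, instead of parsing them as shell operators. Evaluating the string instead lets the shell re-parse it.

```shell
#!/usr/bin/env bash
# Hypothetical reproduction of the entrypoint.sh behavior; function
# names are invented for this sketch.

run_broken() {
  # mimics result="$($1)": word-splitting passes ';' and the second
  # command to `echo` as plain arguments
  result="$($1)"
  printf '%s\n' "$result"
}

run_fixed() {
  # eval re-parses the string, so ';' separates two real commands
  result="$(eval "$1")"
  printf '%s\n' "$result"
}

run_broken 'echo first ; echo second'   # → first ; echo second
run_fixed  'echo first ; echo second'   # → first
                                        #   second
```

Note that eval on untrusted input is risky in general, but here the command string already comes from the workflow author, who controls the job anyway.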

koslib commented 4 years ago

@mingujotemp You are 100% right. I've managed to reproduce this issue on a separate branch. It seems the problem comes from multiline commands.

I'm currently looking into it - please let me know if you have any good ideas as well.

Right after we resolve this, I'll also merge PR#2 to get the stable charts repo inside by default.

koslib commented 4 years ago

Update: Changed the entrypoint altogether, and now things seem to work with multiline commands. I used it like this:

command: |
  echo "first command"
  echo "second command"

Please confirm that this works for you too, so we can close this issue. Thanks!
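For anyone curious, an entrypoint that behaves this way could be as simple as the sketch below (assumed, not the actual entrypoint.sh; INPUT_COMMAND is the env var visible in the docker run log above, and a demo default is set so the sketch runs as-is):

```shell
#!/usr/bin/env bash
# Sketch of an entrypoint that supports multiline commands. Assumes the
# action exposes its `command` input as INPUT_COMMAND, as seen in the
# docker run log in this thread.
set -eo pipefail

# Demo default so this sketch is runnable outside the action.
INPUT_COMMAND="${INPUT_COMMAND:-$'echo "first command"\necho "second command"'}"

# Handing the whole string to `bash -c` lets the shell parse newlines,
# ';', quotes and flags exactly as it would on a terminal.
bash -c "$INPUT_COMMAND"
```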

mingujotemp commented 4 years ago

Nice! Thanks a lot. That works like a charm for me.

I can now close the PR I opened, but would you like me to add the stable chart repo to the Dockerfile? I can open a PR if you want.

koslib commented 4 years ago

Nice!

It'd make sense to add the stable charts repo in the Dockerfile so that it exists there by default. I'll leave this up to you: if you have some time, feel free to send in a PR; otherwise I'll take care of it as soon as I find some more time :D
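The Dockerfile change could be as small as one layer; something like the fragment below (the base image and tag are assumed for illustration, the repo URL is the one from this thread):

```dockerfile
# Assumed fragment; base image and surrounding layers are hypothetical.
FROM alpine/helm:3.2.4

# Pre-add the stable charts repo so workflows don't need to run
# `helm repo add` themselves before upgrading a stable/* chart.
RUN helm repo add stable https://kubernetes-charts.storage.googleapis.com/ && \
    helm repo update
```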

Thanks so much for your help and effort to debug and fix this case!