ansible-collections / community.kubernetes

Kubernetes Collection for Ansible
https://galaxy.ansible.com/community/kubernetes
GNU General Public License v3.0

Check mode not working properly for the helm module when deploying releases #280

Closed. arcalys closed this issue 4 years ago.

arcalys commented 4 years ago
SUMMARY

I am trying to install a few Helm releases into local and staging Kubernetes clusters through Ansible, using the helm module. However, when running in check mode, the task always fails if the release does not exist yet.

It seems to be related to this change: https://github.com/ansible-collections/community.kubernetes/commit/7946b398a7fd42bd8fa8cfb02ccbf03b2d135a60#diff-25ae73979f193a414d8f45fc01c829eb4eb80cf46777c882108ceb7c20f2dc0d

In check_mode, the module does not check whether release_status is null/None before using it.
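For illustration only, a guard along these lines would avoid subscripting a missing status in check mode. The function and variable names here are hypothetical, not the module's actual code:

```python
# Hypothetical sketch of the missing guard; the real helm.py has its own
# helpers and structure. release_status is None when the release is not
# installed yet, so it must not be subscripted unconditionally.
def needs_deploy(release_status, chart_ref, values):
    if release_status is None:
        # Release does not exist yet: check mode should simply report
        # "changed" instead of raising
        # TypeError: 'NoneType' object is not subscriptable.
        return True
    # Only compare against the existing release once the status dict exists.
    return (release_status["chart"] != chart_ref
            or release_status["values"] != values)
```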

ISSUE TYPE

Bug Report

COMPONENT NAME

community/kubernetes/plugins/modules/helm.py

ANSIBLE VERSION
ansible 2.9.6
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3/dist-packages/ansible
  executable location = /usr/bin/ansible
  python version = 3.8.5 (default, Jul 28 2020, 12:59:40) [GCC 9.3.0]
CONFIGURATION
COMMAND_WARNINGS(/home/user/Documents/Projects/Infrastructure/k8s/ansible.cfg) = False
DEFAULT_HOST_LIST(/home/user/Documents/Projects/Infrastructure/k8s/ansible.cfg) = ['/home/user/Documents/Projects/Infrastructure/k8s/hosts.yaml']
DEFAULT_VAULT_PASSWORD_FILE(/home/user/Documents/Projects/Infrastructure/k8s/ansible.cfg) = /home/user/.ansible/vault-password
RETRY_FILES_ENABLED(/home/user/Documents/Projects/Infrastructure/k8s/ansible.cfg) = False
SYSTEM_WARNINGS(/home/user/Documents/Projects/Infrastructure/k8s/ansible.cfg) = False
OS / ENVIRONMENT

Ubuntu 20.04

STEPS TO REPRODUCE

Hosts file:

---
all:
  hosts:
    local:
      context: k3d-local

Vars:

helm_releases:
  - name: traefik
    namespace: kube-system
    chart: traefik/traefik
  - name: cert-manager
    namespace: kube-system
    chart: jetstack/cert-manager
  - name: sso
    namespace: kube-system
    chart: helm-incubator/buzzfeed-sso

Playbook:

- hosts: all
  tasks:
    - name: Helm - deploy releases
      community.kubernetes.helm:
        release_name: "{{ item.name }}"
        release_namespace: "{{ item.namespace }}"
        chart_ref: "{{ item.chart }}"
        values: "{{ lookup('template', '../components/{{ item.namespace}}/{{ item.name }}/values.yaml') | from_yaml }}"
      loop: "{{ helm_releases }}"

ansible.cfg:

[defaults]
command_warnings = False
inventory = ./hosts.yaml
retry_files_enabled = False
system_warnings = False
vault_password_file = ~/.ansible/vault-password
Command:

ansible-playbook --diff playbooks/install.yaml --limit local --check
EXPECTED RESULTS

I would expect check mode to work properly and report a changed status:

TASK [Helm - deploy releases] *********************************************************************************************************************************************************************************************************************************
changed: [local] => (item={'name': 'traefik', 'namespace': 'kube-system', 'chart': 'traefik/traefik'})
changed: [local] => (item={'name': 'cert-manager', 'namespace': 'kube-system', 'chart': 'jetstack/cert-manager'})
changed: [local] => (item={'name': 'sso', 'namespace': 'kube-system', 'chart': 'helm-incubator/buzzfeed-sso'})
ACTUAL RESULTS

An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: 'NoneType' object is not subscriptable
failed: [local] (item={'name': 'cert-manager', 'namespace': 'kube-system', 'chart': 'jetstack/cert-manager'}) => {"ansible_loop_var": "item", "changed": false, "item": {"chart": "jetstack/cert-manager", "name": "cert-manager", "namespace": "kube-system"}, "module_stderr": "Traceback (most recent call last):\n  File \"/home/user/.ansible/tmp/ansible-tmp-1603349496.6651795-84366151791904/AnsiballZ_helm.py\", line 102, in <module>\n    _ansiballz_main()\n  File \"/home/user/.ansible/tmp/ansible-tmp-1603349496.6651795-84366151791904/AnsiballZ_helm.py\", line 94, in _ansiballz_main\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n  File \"/home/user/.ansible/tmp/ansible-tmp-1603349496.6651795-84366151791904/AnsiballZ_helm.py\", line 40, in invoke_module\n    runpy.run_module(mod_name='ansible_collections.community.kubernetes.plugins.modules.helm', init_globals=None, run_name='__main__', alter_sys=True)\n  File \"/usr/lib/python3.8/runpy.py\", line 207, in run_module\n    return _run_module_code(code, init_globals, run_name, mod_spec)\n  File \"/usr/lib/python3.8/runpy.py\", line 97, in _run_module_code\n    _run_code(code, mod_globals, init_globals,\n  File \"/usr/lib/python3.8/runpy.py\", line 87, in _run_code\n    exec(code, run_globals)\n  File \"/tmp/ansible_community.kubernetes.helm_payload_2z7mz97a/ansible_community.kubernetes.helm_payload.zip/ansible_collections/community/kubernetes/plugins/modules/helm.py\", line 573, in <module>\n  File \"/tmp/ansible_community.kubernetes.helm_payload_2z7mz97a/ansible_community.kubernetes.helm_payload.zip/ansible_collections/community/kubernetes/plugins/modules/helm.py\", line 541, in main\nTypeError: 'NoneType' object is not subscriptable\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
Akasurde commented 4 years ago

@arcalys Thanks for reporting this issue. I will try to reproduce this at my end.

Akasurde commented 4 years ago

resolved_by_pr #281

Akasurde commented 4 years ago

@arcalys Could you please check if #281 resolves the issue for you and let me know? Thanks.

arcalys commented 4 years ago

@Akasurde Just tested it and I can confirm this resolves the issue. Thanks for the fast response!

Cheers