helm version
version.BuildInfo{Version:"v3.7.1+7.el8", GitCommit:"8f33223fe17957f11ba7a88b016bc860f034c4e6", GitTreeState:"clean", GoVersion:"go1.16.7"}
PyYAML version
Name: PyYAML
Version: 6.0
Summary: YAML parser and emitter for Python
Home-page: https://pyyaml.org/
Author: Kirill Simonov
Author-email: xi@resolvent.net
License: MIT
Location: /usr/local/lib64/python3.6/site-packages
Requires:
@plegg-rh I'm unable to reproduce this. Could you provide the role that's being included?
@gravesm, thanks for the quick reply.
helm > tasks > main.yaml
---
- name: "Ensure repo ({{ helm.repo_name }}) is added to Helm for {{ helm.release_name }}"
  kubernetes.core.helm_repository:
    name: "{{ helm.repo_name }}"
    repo_url: "{{ helm.chart_repo_url }}"

- name: "Configure release {{ helm.release_name }} on {{ ocp_cluster.env }} Cluster"
  kubernetes.core.helm:
    api_key: "{{ login_info.openshift_auth.api_key }}"
    host: "https://api.{{ ocp_cluster.env }}.{{ install_config.base_domain }}:6443"
    validate_certs: false
    release_state: present
    chart_ref: "{{ helm.chart_ref }}"
    name: "{{ helm.release_name }}"
    release_namespace: "{{ helm.release_namespace }}"
    chart_repo_url: "{{ helm.chart_repo_url }}"
    chart_version: "{{ helm.chart_version }}"
    create_namespace: true
    update_repo_cache: true
    values_files:
      - "{{ helm.chart_values }}"
helm > default > main.yaml
---
helm:
  release_name: cert-manager
  release_namespace: cert-manager
  chart_repo_url: https://charts.jetstack.io/
  chart_version: 1.8.2
  create_namespace: true
  update_repo_cache: true
  chart_values: "{{ playbook_dir }}/helm-values/cert-manager-values.yaml"
  repo_name: jetstack
playbook:
---
- name: Get host info for base plays
  hosts: localhost
  gather_facts: false
  vars_files:
    - charts.yaml
  tasks:
    - name: Login to cluster
      ansible.builtin.import_role:
        name: login_to_cluster

    - name: Deploy Helm Charts
      ansible.builtin.include_role:
        name: helm
      vars:
        helm:
          release_name: "{{ item.release_name }}"
          release_namespace: "{{ item.release_namespace }}"
          chart_repo_url: "{{ item.chart_repo_url }}"
          chart_version: "{{ item.chart_version }}"
          chart_values: "{{ item.chart_values }}"
          repo_name: "{{ item.repo_name }}"
          chart_ref: "{{ item.chart_ref }}"
      loop: "{{ charts }}"
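For reference, charts.yaml supplies the charts list the loop iterates over. A minimal sketch of its shape, reusing the cert-manager values from the role defaults above (chart_ref matches the one in the invocation output further down; any additional charts would follow the same pattern):
charts.yaml (sketch)
---
charts:
  - release_name: cert-manager
    release_namespace: cert-manager
    chart_repo_url: https://charts.jetstack.io/
    chart_version: 1.8.2
    chart_values: "{{ playbook_dir }}/helm-values/cert-manager-values.yaml"
    repo_name: jetstack
    chart_ref: jetstack/cert-manager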
I've tried again in a venv with the following packages and am still getting the same result.
Package Version
------------------- --------
attrs 22.2.0
cachetools 4.2.4
certifi 2023.5.7
charset-normalizer 2.0.12
google-auth 2.20.0
idna 3.4
importlib-metadata 4.8.3
jsonschema 3.2.0
kubernetes 26.1.0
kubernetes-validate 1.26.0
oauthlib 3.2.2
openshift 0.13.1
pip 21.3.1
pyasn1 0.5.0
pyasn1-modules 0.3.0
pyrsistent 0.18.0
python-dateutil 2.8.2
python-string-utils 1.0.0
PyYAML 6.0
requests 2.27.1
requests-oauthlib 1.3.1
rsa 4.9
setuptools 39.2.0
six 1.16.0
typing_extensions 4.1.1
urllib3 1.26.16
websocket-client 1.3.1
zipp 3.6.0
I tried again to reproduce this using Python 3.9, ansible 2.13, and kubernetes.core 2.4. With the role you provided I'm able to successfully deploy multiple Helm charts into OpenShift. One thing I noticed from the exception you posted is that the module is being executed with Python 3.6, though you list Python 3.9 in your ansible version output. I also tried reproducing with Python 3.6, but it still works for me. It might be worth setting ansible_python_interpreter explicitly, just to be absolutely certain which Python interpreter and environment are being used.
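A minimal sketch of pinning the interpreter at the play level (the path is an assumption; point it at whichever environment actually has the kubernetes/openshift packages installed):
---
- name: Get host info for base plays
  hosts: localhost
  gather_facts: false
  vars:
    # Assumed path, shown for illustration only.
    ansible_python_interpreter: /usr/bin/python3.9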
I was able to succeed by using the python3.6 interpreter on this machine.
I was also able to get the playbook to work in a ubi8 container with python3.9.
TL;DR: it looks like this is a machine-specific issue.
I'm facing the same problem.
ansible [core 2.17.3]
  python version = 3.12.4 (main, Jun 7 2024, 00:00:00) [GCC 14.1.1 20240607 (Red Hat 14.1.1-5)] (/usr/bin/python3)
  jinja version = 3.1.4
Collection Version
kubernetes.core 5.0.0
- name: Deploy redhat-trusted-profile-analyzer chart
  kubernetes.core.helm:
    api_key: '{{ token }}'
    host: '{{ server }}'
    validate_certs: false
    name: redhat-trusted-profile-analyzer
    chart_ref: openshift-helm-charts/redhat-trusted-profile-analyzer
    chart_version: '0.1.1'
    release_namespace: '{{ tpa_project }}'
    update_repo_cache: false
    values: "{{ lookup('template', 'values-rhtpa.yaml.j2') | from_yaml }}"
Any idea how I can debug this further?
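For reference, a minimal check of which Python the modules actually run under, following the interpreter discussion above (the task name is illustrative; running the play with -vvv also prints the discovered interpreter per task):
- name: Show controller and target Python interpreters
  ansible.builtin.debug:
    msg:
      - "controller python: {{ ansible_playbook_python }}"
      - "target interpreter: {{ ansible_python_interpreter | default('auto-discovered') }}"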
SUMMARY
When using kubernetes.core.helm to release a chart on the cluster, the task fails with
'AnsibleModule' object has no attribute 'env_update'
ISSUE TYPE
Bug Report
COMPONENT NAME
kubernetes.core.helm
ANSIBLE VERSION
COLLECTION VERSION
CONFIGURATION
OS / ENVIRONMENT
RHEL 8.7
STEPS TO REPRODUCE
Call the kubernetes.core.helm module with the following params:
invocation:
  api_key:
  chart_ref: jetstack/cert-manager
  chart_version: v1.8.2
  create_namespace: true
  host:
  module_args:
    api_key:
    chart_ref: jetstack/cert-manager
    chart_version: v1.8.2
    create_namespace: true
    host:
    release_name: cert-manager
    release_namespace: cert-manager
    release_state: present
    validate_certs: false
    values_files: ~/oke-auto-deploy/ipi-cluster-install/ansible/helm-values/cert-manager-values.yaml
  release_name: cert-manager
  release_namespace: cert-manager
  release_state: present
  validate_certs: false
  values_files: ~/oke-auto-deploy/ipi-cluster-install/ansible/helm-values/cert-manager-values.yaml
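Reconstructed as a task, those parameters correspond roughly to the following sketch (api_key and host were redacted in the output above and are shown as placeholders; the task name is illustrative):
- name: Deploy cert-manager
  kubernetes.core.helm:
    api_key: "<redacted>"   # not shown in the invocation output
    host: "<redacted>"      # not shown in the invocation output
    chart_ref: jetstack/cert-manager
    chart_version: v1.8.2
    create_namespace: true
    release_name: cert-manager
    release_namespace: cert-manager
    release_state: present
    validate_certs: false
    values_files:
      - ~/oke-auto-deploy/ipi-cluster-install/ansible/helm-values/cert-manager-values.yaml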
EXPECTED RESULTS
The jetstack/cert-manager chart, version v1.8.2, would be deployed to the host using the specified values file.
ACTUAL RESULTS
Somewhere in helm.py an attribute is being referenced that doesn't exist on the AnsibleModule object.