theforeman / foreman-ansible-modules

Ansible modules for interacting with the Foreman API and various plugin APIs such as Katello
GNU General Public License v3.0

theforeman.foreman.host handling of content_source, content_view, lifecycle_environment #1441

Open · gvde opened this issue 2 years ago

gvde commented 2 years ago
SUMMARY

theforeman.foreman.host doesn't pick up the current settings for content_source, content_view and lifecycle_environment, which results in these parameters being re-applied on every run and reported as a diff in check mode.

ISSUE TYPE

Bug Report
ANSIBLE VERSION
ansible [core 2.12.2]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/k/k111111/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.8/site-packages/ansible
  ansible collection location = /home/k/k111111/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible
  python version = 3.8.12 (default, Apr 21 2022, 07:55:08) [GCC 8.5.0 20210514 (Red Hat 8.5.0-10)]
  jinja version = 3.0.2
  libyaml = False
COLLECTION VERSION
$ ansible-galaxy collection list

# /usr/share/ansible/collections/ansible_collections
Collection         Version
------------------ -------
redhat.rhel_mgmt   1.0.0  
theforeman.foreman 3.1.0  

# /home/k/k111111/.ansible/collections/ansible_collections
Collection         Version
------------------ -------
theforeman.foreman 3.4.0  
KATELLO/FOREMAN VERSION
$ rpm -q katello foreman
katello-4.4.1-1.el8.noarch
foreman-3.2.1-1.el8.noarch
STEPS TO REPRODUCE
---
- name: configure foreman server
  hosts: foreman
  gather_facts: false

  tasks:
    - name: Set up foreman8.example.com
      theforeman.foreman.host:
        username: "{{ foreman_username | default(omit) }}"
        password: "{{ foreman_password | default(omit) }}"
        organization: "{{ foreman_organization | default(omit) }}"
        server_url: "{{ foreman_server_url | default(omit) }}"
        name: "foreman8.example.com"
        hostgroup: "alma8-base"
        lifecycle_environment: "Production"
        content_view: "EL 8"
        content_source: "{{ foreman_content_proxy }}"
        environment: "development"
        puppet_ca_proxy: "{{ foreman_proxy }}"
        puppet_proxy: "{{ foreman_puppet_proxy }}"
        interfaces_attributes:
        - type: "interface"
          mac: "de:ad:be:ef:00:11"
          domain: "example.com"
          identifier: "eth0"
          ip: "10.20.30.40"
          ip6: "1234:1234:1234::12"
          managed: true
          name: "foreman8.example.com"
          primary: true
          provision: true
          subnet: "VLAN 25 IPv4"
          subnet6: "VLAN 25 IPv6"
EXPECTED RESULTS

Set the parameters during the first run, then report ok on every subsequent run.

ACTUAL RESULTS
$ ansible-playbook --limit foreman8.example.com hosts.yaml

PLAY [configure foreman server] *****************************************************************************************************************************************************************************

TASK [Set up foreman8.example.com] ******************************************************************************************************************************************************************************
changed: [foreman8.example.com]

PLAY RECAP **************************************************************************************************************************************************************************************************
foreman8.example.com           : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

$ ansible-playbook --limit foreman8.example.com hosts.yaml

PLAY [configure foreman server] *****************************************************************************************************************************************************************************

TASK [Set up foreman8.example.com] ******************************************************************************************************************************************************************************
changed: [foreman8.example.com]

PLAY RECAP **************************************************************************************************************************************************************************************************
foreman8.example.com           : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

$ ansible-playbook --limit foreman8.example.com hosts.yaml --diff --check

PLAY [configure foreman server] *****************************************************************************************************************************************************************************

TASK [Set up foreman8.example.com] ******************************************************************************************************************************************************************************
--- before
+++ after
@@ -5,6 +5,8 @@
             "build": false,
             "comment": "",
             "config_group_ids": [],
+            "content_source_id": 1,
+            "content_view_id": 4,
             "domain_id": 1,
             "enabled": true,
             "environment_id": 2,
@@ -12,6 +14,7 @@
             "hostgroup_id": 1,
             "id": 1,
             "ip": "10.20.30.40",
+            "lifecycle_environment_id": 3,
             "location_id": 2,
             "mac": "de:ad:be:ef:00:11",
             "managed": false,

changed: [foreman8.example.com]

PLAY RECAP **************************************************************************************************************************************************************************************************
foreman8.example.com           : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Content view, content source and lifecycle environment are in fact correctly set after the initial run; the module just never reads the current values back, so it reports a change every time.
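This can be confirmed by reading the host back via the collection's own lookup module. A minimal sketch (the exact key layout of the returned host record, e.g. under content_facet_attributes, may vary with the Katello version):

```yaml
# Sketch: read the host back and inspect the content-related attributes.
# theforeman.foreman.host_info is part of this collection; the key layout
# of the returned record is an assumption and may differ per Katello version.
- name: Read back foreman8.example.com
  theforeman.foreman.host_info:
    username: "{{ foreman_username }}"
    password: "{{ foreman_password }}"
    server_url: "{{ foreman_server_url }}"
    name: "foreman8.example.com"
  register: host_result

- name: Show the content facet as stored on the server
  ansible.builtin.debug:
    msg: "{{ host_result.host.content_facet_attributes | default({}) }}"
```

If the content view, content source and lifecycle environment show up here, the values are persisted server-side and only the module's comparison is at fault.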

coffmant commented 8 months ago

Is there any workaround for this bug?
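Not an official fix, but one possible workaround sketch: only pass the affected parameters when the host does not exist yet, so later runs never re-send them. This assumes theforeman.foreman.host_info fails (or returns no host) for an unknown host; adjust the condition to what your version actually returns:

```yaml
# Workaround sketch (assumption: host_info errors out for a missing host,
# which failed_when: false absorbs). Only the three affected parameters
# are guarded; everything else from the original task stays unchanged.
- name: Check whether the host already exists
  theforeman.foreman.host_info:
    username: "{{ foreman_username }}"
    password: "{{ foreman_password }}"
    server_url: "{{ foreman_server_url }}"
    name: "foreman8.example.com"
  register: existing
  failed_when: false

- name: Set up foreman8.example.com
  theforeman.foreman.host:
    username: "{{ foreman_username }}"
    password: "{{ foreman_password }}"
    server_url: "{{ foreman_server_url }}"
    name: "foreman8.example.com"
    hostgroup: "alma8-base"
    # Send the content parameters only on initial creation:
    lifecycle_environment: "{{ 'Production' if existing.host is not defined else omit }}"
    content_view: "{{ 'EL 8' if existing.host is not defined else omit }}"
    content_source: "{{ foreman_content_proxy if existing.host is not defined else omit }}"
```

The trade-off is that drift in these three attributes is no longer corrected on existing hosts; it only suppresses the spurious changed/diff reported in this issue.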