kubevirt / kubevirt.core

Lean Ansible bindings for KubeVirt
Apache License 2.0

kubevirt vm creation fails with error about missing module support code for ansible_collections.kubevirt.core.plugins.modules.kubevirt_vm #34

Closed: philrich closed this issue 6 months ago

philrich commented 11 months ago
SUMMARY
  localhost failed | msg: Could not find imported module support code for ansible_collections.kubevirt.core.plugins.modules.kubevirt_vm.  Looked for (['ansible_collections.kubernetes.core.plugins.module_utils.k8s.core.AnsibleK8SModule', 'ansible_collections.kubernetes.core.plugins.module_utils.k8s.core'])

This happens when trying to create a VM with the kubevirt.core collection.

ISSUE TYPE
ANSIBLE VERSION
ansible [core 2.14.2]
  config file = /home/openshift/git/cluster-setup/ansible/ansible.cfg
  configured module search path = ['/home/openshift/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.11/site-packages/ansible
  ansible collection location = /home/openshift/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible
  python version = 3.11.2 (main, Feb 17 2023, 09:28:16) [GCC 8.5.0 20210514 (Red Hat 8.5.0-18)] (/usr/bin/python3.11)
  jinja version = 3.1.2
  libyaml = True
COLLECTION VERSION
# /usr/lib/python3.11/site-packages/ansible_collections
Collection                    Version
----------------------------- -------
amazon.aws                    5.2.0
ansible.netcommon             4.1.0
ansible.posix                 1.5.1
ansible.utils                 2.9.0
ansible.windows               1.13.0
arista.eos                    6.0.0
awx.awx                       21.11.0
azure.azcollection            1.14.0
check_point.mgmt              4.0.0
chocolatey.chocolatey         1.4.0
cisco.aci                     2.3.0
cisco.asa                     4.0.0
cisco.dnac                    6.6.3
cisco.intersight              1.0.23
cisco.ios                     4.3.1
cisco.iosxr                   4.1.0
cisco.ise                     2.5.12
cisco.meraki                  2.15.0
cisco.mso                     2.2.1
cisco.nso                     1.0.3
cisco.nxos                    4.0.1
cisco.ucs                     1.8.0
cloud.common                  2.1.2
cloudscale_ch.cloud           2.2.4
community.aws                 5.2.0
community.azure               2.0.0
community.ciscosmb            1.0.5
community.crypto              2.10.0
community.digitalocean        1.23.0
community.dns                 2.5.0
community.docker              3.4.0
community.fortios             1.0.0
community.general             6.3.0
community.google              1.0.0
community.grafana             1.5.3
community.hashi_vault         4.1.0
community.hrobot              1.7.0
community.libvirt             1.2.0
community.mongodb             1.4.2
community.mysql               3.5.1
community.network             5.0.0
community.okd                 2.2.0
community.postgresql          2.3.2
community.proxysql            1.5.1
community.rabbitmq            1.2.3
community.routeros            2.7.0
community.sap                 1.0.0
community.sap_libs            1.4.0
community.skydive             1.0.0
community.sops                1.6.0
community.vmware              3.3.0
community.windows             1.12.0
community.zabbix              1.9.1
containers.podman             1.10.1
cyberark.conjur               1.2.0
cyberark.pas                  1.0.17
dellemc.enterprise_sonic      2.0.0
dellemc.openmanage            6.3.0
dellemc.os10                  1.1.1
dellemc.os6                   1.0.7
dellemc.os9                   1.0.4
dellemc.powerflex             1.5.0
dellemc.unity                 1.5.0
f5networks.f5_modules         1.22.0
fortinet.fortimanager         2.1.7
fortinet.fortios              2.2.2
frr.frr                       2.0.0
gluster.gluster               1.0.2
google.cloud                  1.1.2
grafana.grafana               1.1.0
hetzner.hcloud                1.9.1
hpe.nimble                    1.1.4
ibm.qradar                    2.1.0
ibm.spectrum_virtualize       1.11.0
infinidat.infinibox           1.3.12
infoblox.nios_modules         1.4.1
inspur.ispim                  1.2.0
inspur.sm                     2.3.0
junipernetworks.junos         4.1.0
kubernetes.core               2.3.2
lowlydba.sqlserver            1.3.1
mellanox.onyx                 1.0.0
netapp.aws                    21.7.0
netapp.azure                  21.10.0
netapp.cloudmanager           21.22.0
netapp.elementsw              21.7.0
netapp.ontap                  22.2.0
netapp.storagegrid            21.11.1
netapp.um_info                21.8.0
netapp_eseries.santricity     1.4.0
netbox.netbox                 3.10.0
ngine_io.cloudstack           2.3.0
ngine_io.exoscale             1.0.0
ngine_io.vultr                1.1.3
openstack.cloud               1.10.0
openvswitch.openvswitch       2.1.0
ovirt.ovirt                   2.4.1
purestorage.flasharray        1.16.2
purestorage.flashblade        1.10.0
purestorage.fusion            1.3.0
sensu.sensu_go                1.13.2
splunk.es                     2.1.0
t_systems_mms.icinga_director 1.32.0
theforeman.foreman            3.8.0
vmware.vmware_rest            2.2.0
vultr.cloud                   1.7.0
vyos.vyos                     4.0.0
wti.remote                    1.0.4

# /home/openshift/.ansible/collections/ansible_collections
Collection           Version
-------------------- -------
cloud.common         2.1.3
community.kubernetes 2.0.1
kubernetes.core      2.3.1
kubevirt.core        1.1.0
STEPS TO REPRODUCE

Running `time ansible-playbook -i environments/ocp-play playbooks/kubevirt/vm_create.yaml -C` with the playbook below fails:

---
- name: Create VM
  hosts: localhost
  gather_facts: no

  tasks:
  - name: Create VM
    kubevirt.core.kubevirt_vm:
      state: present
      name: testvm
      namespace: infra-vms
      labels:
        app: svc-rhel8-test-1
      instancetype:
        name: medium
      preference:
        name: rhel8
      data_volume_templates:
        - metadata:
            name: svc-rhel8-test-1-rootdisk
          spec:
            source:
              pvc:
                name: rhel8-rhel8-mbr-none
                namespace: template-vms
            storage:
              resources:
                requests:
                  storage: 30Gi
      spec:
        hostname: svc-rhel8-test-1
        domain:
          cpu:
            cores: 2
            sockets: 1
            threads: 1
          devices:
            disks:
              - bootOrder: 1
                disk:
                  bus: virtio
                name: rootdisk
              - bootOrder: 2
                disk:
                  bus: virtio
                name: cloudinitdisk
            interfaces:
              - bridge: {}
                macAddress: '02:28:c6:00:00:29'
                model: virtio
                name: nic-0
            networkInterfaceMultiqueue: true
            rng: {}
          features:
            acpi: {}
            smm:
              enabled: false
          firmware:
            bootloader:
              bios: {}
          machine:
            type: q35
          resources:
            requests:
              memory: 2Gi
        networks:
          - multus:
              networkName: br1-vlan1
            name: nic-0
        volumes:
          - dataVolume:
              name: svc-rhel8-test-1-rootdisk
            name: rootdisk
          - cloudInitNoCloud:
              networkData:
                version: 1
                config:
                - type: physical
                  name: eth0
                  subnets:
                  - type: static
                    address: "192.168.1.1"
                    netmask: "255.255.255.0"
                    gateway: "192.168.1.0"

              userData: |
                hostname: svc-rhel8-test-1
                fqdn: svc-rhel8-test-1.test
                ssh_pwauth: true
                chpasswd:
                  list: |
                    root:xxxxxxxx
                  expire: false
            name: cloudinitdisk
        terminationGracePeriodSeconds: 180
      wait: yes
EXPECTED RESULTS

The VM should be created.

ACTUAL RESULTS
ansible-playbook [core 2.14.2]
  config file = /home/openshift/git/cluster-setup/ansible/ansible.cfg
  configured module search path = ['/home/openshift/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.11/site-packages/ansible
  ansible collection location = /home/openshift/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.11.2 (main, Feb 17 2023, 09:28:16) [GCC 8.5.0 20210514 (Red Hat 8.5.0-18)] (/usr/bin/python3.11)
  jinja version = 3.1.2
  libyaml = True
Using /home/openshift/git/cluster-setup/ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /home/openshift/git/cluster-setup/ansible/environments/ocp-play/hosts.yaml as it did not pass its verify_file() method
script declined parsing /home/openshift/git/cluster-setup/ansible/environments/ocp-play/hosts.yaml as it did not pass its verify_file() method
Set default localhost to localhost
Parsed /home/openshift/git/cluster-setup/ansible/environments/ocp-play/hosts.yaml inventory source with yaml plugin
Loading collection kubevirt.core from /home/openshift/.ansible/collections/ansible_collections/kubevirt/core
redirecting (type: callback) ansible.builtin.unixy to community.general.unixy
Loading collection community.general from /usr/lib/python3.11/site-packages/ansible_collections/community/general
redirecting (type: callback) ansible.builtin.unixy to community.general.unixy
Loading callback plugin community.general.unixy of type stdout, v2.0 from /usr/lib/python3.11/site-packages/ansible_collections/community/general/plugins/callback/unixy.py
redirecting (type: callback) ansible.builtin.profile_roles to ansible.posix.profile_roles
Loading collection ansible.posix from /usr/lib/python3.11/site-packages/ansible_collections/ansible/posix
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Loading callback plugin ansible.posix.profile_roles of type aggregate, v2.0 from /usr/lib/python3.11/site-packages/ansible_collections/ansible/posix/plugins/callback/profile_roles.py
Executing playbook vm_create.yaml
Positional arguments: playbooks/kubevirt/vm_create.yaml
verbosity: 4
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
diff: True
inventory: ('/home/openshift/git/cluster-setup/ansible/environments/ocp-play',)
forks: 5
1 plays in playbooks/kubevirt/vm_create.yaml

- Create VM on hosts: localhost -
Trying secret FileVaultSecret(filename='/home/openshift/.vault_pass.txt') for vault_id=default
Create VM...
Tuesday 28 November 2023  16:40:44 +0100 (0:00:00.030)       0:00:00.030 ******
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: openshift
<localhost> EXEC /bin/sh -c 'echo ~openshift && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/openshift/.ansible/tmp `"&& mkdir "` echo /home/openshift/.ansible/tmp/ansible-tmp-1701186044.463044-2147249-28832670259323 `" && echo ansible-tmp-1701186044.463044-2147249-28832670259323="` echo /home/openshift/.ansible/tmp/ansible-tmp-1701186044.463044-2147249-28832670259323 `" ) && sleep 0'
Loading collection kubernetes.core from /home/openshift/.ansible/collections/ansible_collections/kubernetes/core
Loading collection cloud.common from /home/openshift/.ansible/collections/ansible_collections/cloud/common
<localhost> EXEC /bin/sh -c 'rm -f -r /home/openshift/.ansible/tmp/ansible-tmp-1701186044.463044-2147249-28832670259323/ > /dev/null 2>&1 && sleep 0'
  localhost failed: {
    "msg": "Could not find imported module support code for ansible_collections.kubevirt.core.plugins.modules.kubevirt_vm.  Looked for (['ansible_collections.kubernetes.core.plugins.module_utils.k8s.core.AnsibleK8SModule', 'ansible_collections.kubernetes.core.plugins.module_utils.k8s.core'])"
}

- Play recap -
  localhost                  : ok=0    changed=0    unreachable=0    failed=1    rescued=0    ignored=0
Tuesday 28 November 2023  16:40:44 +0100 (0:00:00.094)       0:00:00.125 ******
===============================================================================
kubevirt.core.kubevirt_vm ----------------------------------------------- 0.09s
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
total ------------------------------------------------------------------- 0.09s
0xFelix commented 11 months ago

Have you tried with kubernetes.core == 2.4.0?
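For context: the collection listings in the report show kubernetes.core 2.3.1 under ~/.ansible/collections (which takes precedence over the 2.3.2 copy in the system path), older than the 2.4.0 suggested here. A minimal sketch of checking and upgrading, assuming the version strings from the report and the >= 2.4.0 requirement from this comment (the comparison uses `sort -V`; the install command is only printed, not executed):

```shell
# Versions as shown in the collection listings above
installed="2.3.1"   # ~/.ansible/collections copy, which shadows the system-wide 2.3.2
required="2.4.0"    # version suggested in the comment above

# sort -V orders version strings numerically; if the lowest of the pair is not
# the required version, the installed one is older than required
if [ "$(printf '%s\n' "$installed" "$required" | sort -V | head -n1)" != "$required" ]; then
    echo "kubernetes.core $installed is older than $required"
    echo "run: ansible-galaxy collection install 'kubernetes.core:>=2.4.0' --upgrade"
fi
```

Note that `ansible-galaxy collection install --upgrade` only updates the user collection path; the older copy in `/usr/lib/python3.11/site-packages/ansible_collections` stays in place but is shadowed.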

kubevirt-bot commented 8 months ago

Issues go stale after 90d of inactivity. Mark the issue as fresh with /remove-lifecycle stale. Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

/lifecycle stale

kubevirt-bot commented 7 months ago

Stale issues rot after 30d of inactivity. Mark the issue as fresh with /remove-lifecycle rotten. Rotten issues close after an additional 30d of inactivity.

If this issue is safe to close now please do so with /close.

/lifecycle rotten

kubevirt-bot commented 6 months ago

Rotten issues close after 30d of inactivity. Reopen the issue with /reopen. Mark the issue as fresh with /remove-lifecycle rotten.

/close

kubevirt-bot commented 6 months ago

@kubevirt-bot: Closing this issue.

In response to [this](https://github.com/kubevirt/kubevirt.core/issues/34#issuecomment-2080460633):

> Rotten issues close after 30d of inactivity.
> Reopen the issue with `/reopen`.
> Mark the issue as fresh with `/remove-lifecycle rotten`.
>
> /close

Instructions for interacting with me using PR comments are available [here](https://git.k8s.io/community/contributors/guide/pull-requests.md). If you have questions or suggestions related to my behavior, please file an issue against the [kubernetes/test-infra](https://github.com/kubernetes/test-infra/issues/new?title=Prow%20issue:) repository.