ansible-collections / netapp.ontap

Ansible collection to support NetApp ONTAP configuration.
https://galaxy.ansible.com/netapp/ontap
GNU General Public License v3.0

na_ontap_rest_info gather_subset=cluster_node_info fails in precluster #90

Closed JamesPGriffith closed 2 years ago

JamesPGriffith commented 2 years ago

Summary

When attempting to pull node information in precluster mode (to use while building the cluster), the module fails.

Component Name

na_ontap_rest_info

Ansible Version

user@jumphost:~/ontap-setup-automation$ ansible --version
ansible [core 2.13.2]
  config file = /home/user/ontap-setup-automation/ansible.cfg
  configured module search path = ['/home/user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /home/user/.local/lib/python3.8/site-packages/ansible
  ansible collection location = /home/user/.ansible/collections:/usr/share/ansible/collections
  executable location = /home/user/.local/bin/ansible
  python version = 3.8.10 (default, Jun 22 2022, 20:18:18) [GCC 9.4.0]
  jinja version = 3.1.2
  libyaml = True
user@jumphost:~/ontap-setup-automation$

ONTAP Collection Version

user@jumphost:~/ontap-setup-automation$ ansible-galaxy collection list

# /home/user/.local/lib/python3.8/site-packages/ansible_collections
Collection                    Version
----------------------------- -------
amazon.aws                    3.3.1  
ansible.netcommon             3.0.1  
ansible.posix                 1.4.0  
ansible.utils                 2.6.1  
ansible.windows               1.10.0 
arista.eos                    5.0.1  
awx.awx                       21.2.0 
azure.azcollection            1.13.0 
check_point.mgmt              2.3.0  
chocolatey.chocolatey         1.3.0  
cisco.aci                     2.2.0  
cisco.asa                     3.1.0  
cisco.dnac                    6.5.0  
cisco.intersight              1.0.19 
cisco.ios                     3.2.0  
cisco.iosxr                   3.2.0  
cisco.ise                     2.5.0  
cisco.meraki                  2.10.0 
cisco.mso                     2.0.0  
cisco.nso                     1.0.3  
cisco.nxos                    3.1.0  
cisco.ucs                     1.8.0  
cloud.common                  2.1.2  
cloudscale_ch.cloud           2.2.2  
community.aws                 3.4.0  
community.azure               1.1.0  
community.ciscosmb            1.0.5  
community.crypto              2.4.0  
community.digitalocean        1.21.0 
community.dns                 2.2.1  
community.docker              2.7.0  
community.fortios             1.0.0  
community.general             5.3.0  
community.google              1.0.0  
community.grafana             1.5.0  
community.hashi_vault         3.0.0  
community.hrobot              1.4.0  
community.libvirt             1.1.0  
community.mongodb             1.4.1  
community.mysql               3.3.0  
community.network             4.0.1  
community.okd                 2.2.0  
community.postgresql          2.1.5  
community.proxysql            1.4.0  
community.rabbitmq            1.2.1  
community.routeros            2.1.0  
community.sap                 1.0.0  
community.sap_libs            1.1.0  
community.skydive             1.0.0  
community.sops                1.2.3  
community.vmware              2.7.0  
community.windows             1.10.0 
community.zabbix              1.7.0  
containers.podman             1.9.4  
cyberark.conjur               1.1.0  
cyberark.pas                  1.0.14 
dellemc.enterprise_sonic      1.1.1  
dellemc.openmanage            5.5.0  
dellemc.os10                  1.1.1  
dellemc.os6                   1.0.7  
dellemc.os9                   1.0.4  
f5networks.f5_modules         1.18.0 
fortinet.fortimanager         2.1.5  
fortinet.fortios              2.1.6  
frr.frr                       2.0.0  
gluster.gluster               1.0.2  
google.cloud                  1.0.2  
hetzner.hcloud                1.8.1  
hpe.nimble                    1.1.4  
ibm.qradar                    2.0.0  
infinidat.infinibox           1.3.3  
infoblox.nios_modules         1.3.0  
inspur.sm                     2.0.0  
junipernetworks.junos         3.1.0  
kubernetes.core               2.3.2  
mellanox.onyx                 1.0.0  
netapp.aws                    21.7.0 
netapp.azure                  21.10.0
netapp.cloudmanager           21.18.0
netapp.elementsw              21.7.0 
netapp.ontap                  21.20.0
netapp.storagegrid            21.10.0
netapp.um_info                21.8.0 
netapp_eseries.santricity     1.3.0  
netbox.netbox                 3.7.1  
ngine_io.cloudstack           2.2.4  
ngine_io.exoscale             1.0.0  
ngine_io.vultr                1.1.2  
openstack.cloud               1.8.0  
openvswitch.openvswitch       2.1.0  
ovirt.ovirt                   2.1.0  
purestorage.flasharray        1.13.0 
purestorage.flashblade        1.9.0  
purestorage.fusion            1.0.2  
sensu.sensu_go                1.13.1 
servicenow.servicenow         1.0.6  
splunk.es                     2.0.0  
t_systems_mms.icinga_director 1.30.0 
theforeman.foreman            3.4.0  
vmware.vmware_rest            2.2.0  
vyos.vyos                     3.0.1  
wti.remote                    1.0.4  

# /home/user/.ansible/collections/ansible_collections
Collection        Version
----------------- -------
community.general 5.4.0  
netapp.ontap      21.21.0
user@jumphost:~/ontap-setup-automation$

ONTAP Version

user@jumphost:~/ontap-setup-automation$ ssh admin@cluster1-01
Password:
cluster1-01::> version
NetApp Release 9.8P10: Fri Feb 04 19:51:21 UTC 2022

Notice: Showing the version for the local node; the cluster-wide version could not be determined.

cluster1-01::>

Playbook

ansible localhost -m netapp.ontap.na_ontap_rest_info -a "validate_certs=no hostname=cluster1-01 username=admin password=Netapp1! gather_subset=cluster_node_info"

Steps to Reproduce

ansible localhost -m netapp.ontap.na_ontap_rest_info -a "validate_certs=no hostname=cluster1-01 username=admin password=Netapp1! gather_subset=cluster_node_info"
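For reference, the same ad-hoc call expressed as a playbook task (a sketch with the same parameters as the command above; `vault_admin_password` is a placeholder variable, since a real playbook should pull the password from a vault rather than the command line):

```yaml
# Sketch of the failing call as a playbook task.
# vault_admin_password is a hypothetical vault variable.
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Gather cluster_node_info in precluster
      netapp.ontap.na_ontap_rest_info:
        hostname: cluster1-01
        username: admin
        password: "{{ vault_admin_password }}"
        validate_certs: false
        gather_subset: cluster_node_info
      register: node_info
```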

Expected Results

I expect at least this output:

user@jumphost:~/ontap-setup-automation$ ansible localhost -m netapp.ontap.na_ontap_restit -a "validate_certs=no hostname=cluster1-01 username=admin password=Netapp1! api=/cluster/nodes"
[WARNING]: No inventory was parsed, only implicit localhost is available
localhost | CHANGED => {
    "changed": true,
    "response": {
        "num_records": 2,
        "records": [
            {
                "name": "cluster1-02",
                "uuid": "8eb0365d-173f-11ed-954d-00a0985cb457"
            },
            {
                "name": "cluster1-01",
                "uuid": "9a968ea4-173f-11ed-a196-00a0985d172e"
            }
        ]
    },
    "status_code": 200
}
user@jumphost:~/ontap-setup-automation$

But this output is what I was actually looking for:

user@jumphost:~/ontap-setup-automation$ ansible localhost -m netapp.ontap.na_ontap_restit -a "validate_certs=no hostname=cluster1-01 username=admin password=Netapp1! api=/cluster/nodes?fields=*"
[WARNING]: No inventory was parsed, only implicit localhost is available
localhost | CHANGED => {
    "changed": true,
    "response": {
        "num_records": 2,
        "records": [
            {
                "cluster_interfaces": [
                    {
                        "ip": {
                            "address": "169.254.102.148"
                        }
                    }
                ],
                "ha": {
                    "partners": [
                        {
                            "name": "cluster1-01",
                            "uuid": "9a968ea4-173f-11ed-a196-00a0985d172e"
                        }
                    ]
                },
                "membership": "available",
                "model": "FAS8040",
                "name": "cluster1-02",
                "serial_number": "701427000490",
                "state": "up",
                "uuid": "8eb0365d-173f-11ed-954d-00a0985cb457",
                "version": {
                    "full": "9.8P10",
                    "generation": 9,
                    "major": 8,
                    "minor": 0
                }
            },
            {
                "cluster_interfaces": [
                    {
                        "ip": {
                            "address": "169.254.241.186"
                        }
                    }
                ],
                "ha": {
                    "partners": [
                        {
                            "name": "cluster1-02",
                            "uuid": "8eb0365d-173f-11ed-954d-00a0985cb457"
                        }
                    ]
                },
                "management_interfaces": [
                    {
                        "ip": {
                            "address": "192.168.0.111"
                        },
                        "name": "mgmt1"
                    }
                ],
                "membership": "available",
                "metrocluster": {
                    "type": "fc"
                },
                "model": "FAS8040",
                "name": "cluster1-01",
                "serial_number": "701427000483",
                "state": "up",
                "uuid": "9a968ea4-173f-11ed-a196-00a0985d172e",
                "version": {
                    "full": "9.8P10",
                    "generation": 9,
                    "major": 8,
                    "minor": 0
                }
            }
        ]
    },
    "status_code": 200
}
user@jumphost:~/ontap-setup-automation$
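Until the fix lands, the na_ontap_restit call above can serve as a workaround in playbook form. A sketch, assuming the same credentials (`vault_admin_password` is a placeholder); GET on /cluster/nodes is one of the few calls ONTAP permits in precluster, per the error message below:

```yaml
# Workaround sketch: query /cluster/nodes directly via the REST
# passthrough module, which precluster mode still allows for GET.
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Get node details in precluster via na_ontap_restit
      netapp.ontap.na_ontap_restit:
        hostname: cluster1-01
        username: admin
        password: "{{ vault_admin_password }}"
        validate_certs: false
        api: /cluster/nodes?fields=*
      register: nodes
```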

Actual Results

user@jumphost:~/ontap-setup-automation$ ansible localhost -m netapp.ontap.na_ontap_rest_info -a "validate_certs=no hostname=cluster1-01 username=admin password=Netapp1! gather_subset=cluster_node_info" -vvvv
ansible [core 2.13.2]
  config file = /home/user/ontap-setup-automation/ansible.cfg
  configured module search path = ['/home/user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /home/user/.local/lib/python3.8/site-packages/ansible
  ansible collection location = /home/user/.ansible/collections:/usr/share/ansible/collections
  executable location = /home/user/.local/bin/ansible
  python version = 3.8.10 (default, Jun 22 2022, 20:18:18) [GCC 9.4.0]
  jinja version = 3.1.2
  libyaml = True
Using /home/user/ontap-setup-automation/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
yaml declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
ini declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
toml declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
[WARNING]: No inventory was parsed, only implicit localhost is available
Loading callback plugin minimal of type stdout, v2.0 from /home/user/.local/lib/python3.8/site-packages/ansible/plugins/callback/minimal.py
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
META: ran handlers
Loading collection netapp.ontap from /home/user/.ansible/collections/ansible_collections/netapp/ontap
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: user
<127.0.0.1> EXEC /bin/sh -c 'echo ~user && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/user/.ansible/tmp `"&& mkdir "` echo /home/user/.ansible/tmp/ansible-tmp-1660142573.1082475-67364-24143402545514 `" && echo ansible-tmp-1660142573.1082475-67364-24143402545514="` echo /home/user/.ansible/tmp/ansible-tmp-1660142573.1082475-67364-24143402545514 `" ) && sleep 0'
Using module file /home/user/.ansible/collections/ansible_collections/netapp/ontap/plugins/modules/na_ontap_rest_info.py
<127.0.0.1> PUT /home/user/.ansible/tmp/ansible-local-67360i_kodpdw/tmpawnas6ew TO /home/user/.ansible/tmp/ansible-tmp-1660142573.1082475-67364-24143402545514/AnsiballZ_na_ontap_rest_info.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /home/user/.ansible/tmp/ansible-tmp-1660142573.1082475-67364-24143402545514/ /home/user/.ansible/tmp/ansible-tmp-1660142573.1082475-67364-24143402545514/AnsiballZ_na_ontap_rest_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /home/user/.ansible/tmp/ansible-tmp-1660142573.1082475-67364-24143402545514/AnsiballZ_na_ontap_rest_info.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /home/user/.ansible/tmp/ansible-tmp-1660142573.1082475-67364-24143402545514/ > /dev/null 2>&1 && sleep 0'
localhost | FAILED! => {
    "changed": false,
    "invocation": {
        "module_args": {
            "cert_filepath": null,
            "feature_flags": {},
            "fields": null,
            "gather_subset": [
                "cluster_node_info"
            ],
            "hostname": "cluster1-01",
            "http_port": null,
            "https": false,
            "key_filepath": null,
            "max_records": 1024,
            "ontapi": null,
            "owning_resource": null,
            "parameters": null,
            "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "state": null,
            "use_python_keys": false,
            "use_rest": "auto",
            "username": "admin",
            "validate_certs": false
        }
    },
    "msg": {
        "code": "9241607",
        "message": "Only POST/OPTIONS on /api/cluster, GET/HEAD/OPTIONS on /api/cluster/nodes, or calls on /api/cluster/jobs are available in precluster."
    }
}
user@jumphost:~/ontap-setup-automation$
lonico commented 2 years ago

Fixed in 21.23.0: DEVOPS-5338 na_ontap_rest_info - new option ignore_api_errors to report errors in the subset results rather than breaking execution.
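With 21.23.0 or later, the new option could be used along these lines. This is a sketch that assumes ignore_api_errors takes a list of error identifiers to tolerate (the code "9241607" comes from the failure output above); confirm the exact accepted values against the module documentation:

```yaml
# Sketch using the 21.23.0 fix: report the precluster API error inside
# the subset result instead of failing the whole run. The value passed
# to ignore_api_errors is an assumption; check the module docs.
- name: Gather info in precluster without aborting on restricted subsets
  netapp.ontap.na_ontap_rest_info:
    hostname: cluster1-01
    username: admin
    password: "{{ vault_admin_password }}"
    validate_certs: false
    gather_subset: cluster_node_info
    ignore_api_errors:
      - "9241607"
  register: info
```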

carchi8py commented 2 years ago

This has been fixed with the 21.23.0 release.