dell / dellemc-openmanage-ansible-modules

Dell OpenManage Ansible Modules
GNU General Public License v3.0

dellemc_idrac_storage_volume: Raid 1 fail on BOSS adapter. #149

Closed: janr7 closed this issue 4 years ago

janr7 commented 4 years ago

Hi

We would like to standardize on dellemc_idrac_storage_volume as the module we use to create our RAID virtual disks (VDs).

Could you please assist in achieving this? Thanks so much.

Used: https://galaxy.ansible.com/download/dellemc-openmanage-2.1.1.tar.gz

Model: PowerEdge R740xd
BIOS Version: 2.8.1
iDRAC Firmware Version: 4.22.00.00

No OS is installed on the server.

Output from -vvv (the log looks clean apart from the final failure):

fatal: [localhost]: FAILED! => {
    "changed": false,
    "invocation": {
        "module_args": {
            "capacity": null,
            "controller_id": "AHCI.Slot.2-1",
            "disk_cache_policy": "Default",
            "idrac_ip": "10.145.103.135",
            "idrac_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "idrac_port": 443,
            "idrac_user": "Laboradmin",
            "media_type": null,
            "number_dedicated_hot_spare": 0,
            "protocol": null,
            "raid_init_operation": null,
            "raid_reset_config": "True",
            "read_cache_policy": "NoReadAhead",
            "span_depth": 1,
            "span_length": 1,
            "state": "create",
            "stripe_size": 65536,
            "volume_id": null,
            "volume_type": "RAID 0",
            "volumes": [
                {
                    "disk_cache_policy": "Default",
                    "drives": {
                        "id": [
                            "Disk.Direct.0-0:AHCI.Slot.2-1",
                            "Disk.Direct.1-1:AHCI.Slot.2-1"
                        ]
                    },
                    "name": "VD_R1_1_a",
                    "raid_init_operation": "Fast",
                    "span_depth": 1,
                    "span_length": 1,
                    "stripe_size": 65536,
                    "volume_type": "RAID 1"
                }
            ],
            "write_cache_policy": "WriteThrough"
        }
    },
    "msg": "Failed to perform storage operation"
}

Output from view:

ok: [localhost] => {
    "result": {
        "changed": false,
        "failed": false,
        "msg": "Successfully completed the view storage volume operation",
        "storage_status": {
            "Message": {
                "Controller": {
                    "AHCI.Embedded.1-1": {
                        "ControllerSensor": {
                            "AHCI.Embedded.1-1": {}
                        }
                    },
                    "AHCI.Embedded.2-1": {
                        "ControllerSensor": {
                            "AHCI.Embedded.2-1": {}
                        }
                    },
                    "AHCI.Slot.2-1": {
                        "ControllerSensor": {
                            "AHCI.Slot.2-1": {}
                        },
                        "PhysicalDisk": [
                            "Disk.Direct.0-0:AHCI.Slot.2-1",
                            "Disk.Direct.1-1:AHCI.Slot.2-1"
                        ]
                    },
                    "NonRAID.Slot.6-1": {
                        "ControllerSensor": {
                            "NonRAID.Slot.6-1": {}
                        },
                        "Enclosure": {
                            "Enclosure.Internal.0-1:NonRAID.Slot.6-1": {
                                "EnclosureSensor": {
                                    "Enclosure.Internal.0-1:NonRAID.Slot.6-1": {}
                                },
                                "PhysicalDisk": [
                                    "Disk.Bay.0:Enclosure.Internal.0-1:NonRAID.Slot.6-1"
                                ]
                            }
                        }
                    }
                },
                "PCIeSSDExtender": {
                    "PCIeExtender.Slot.3": {
                        "PCIeSSDDisk": [
                            "Disk.Bay.20:Enclosure.Internal.0-1:PCIeExtender.Slot.3",
                            "Disk.Bay.22:Enclosure.Internal.0-1:PCIeExtender.Slot.3",
                            "Disk.Bay.23:Enclosure.Internal.0-1:PCIeExtender.Slot.3",
                            "Disk.Bay.21:Enclosure.Internal.0-1:PCIeExtender.Slot.3"
                        ]
                    },
                    "PCIeExtender.Slot.4": {
                        "PCIeSSDDisk": [
                            "Disk.Bay.19:Enclosure.Internal.0-1:PCIeExtender.Slot.4",
                            "Disk.Bay.18:Enclosure.Internal.0-1:PCIeExtender.Slot.4"
                        ]
                    },
                    "PCIeExtender.Slot.8": {}
                }
            },
            "Status": "Success"
        }
    }
}

qwertzlbert commented 4 years ago

Hi,

I think the issue is the configured 'span_length: 1' option. RAID 1 requires a span length of 2, since the mirror is built from 2 disks; otherwise the creation will fail. I will post a more detailed example and execution log later.
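
In the meantime, the minimal change, assuming the rest of your task stays exactly as posted (only the span settings inside the volumes entry change):

    volumes:
      - name: "VD_R1_1_a"
        volume_type: RAID 1
        span_length: 2  # two disks form the mirror
        span_depth: 1   # a single span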

Best Regards

qwertzlbert commented 4 years ago

Here's the example, as promised.

Example playbook configure-boss.yml:

---

- name: boss RAID test
  hosts: all
  gather_facts: no
  connection: local
  collections:
    - dellemc.openmanage
  tasks:
    - name: view Storage
      dellemc_idrac_storage_volume:
        idrac_ip: "{{ idrac_ip }}"
        idrac_user: "{{ idrac_user }}"
        idrac_password: "{{ idrac_user_password }}"
        state: view

    - name: create RAID volume
      dellemc_idrac_storage_volume:
        idrac_ip: "{{ idrac_ip }}"
        idrac_user: "{{ idrac_user }}"
        idrac_password: "{{ idrac_user_password }}"
        raid_reset_config: True
        state: create
        controller_id: AHCI.Slot.5-1
        volumes:
          - name: "Test-R1"
            volume_type: RAID 1
            disk_cache_policy: Default
            raid_init_operation: Fast
            span_length: 2   # a RAID 1 mirror is built from two disks
            span_depth: 1    # in a single span
            stripe_size: 65536
            drives:
              id: ["Disk.Direct.0-0:AHCI.Slot.5-1","Disk.Direct.1-1:AHCI.Slot.5-1"]
      register: result_raid

    - debug:
        var: result_raid

Stdout of ansible-playbook -i hosts configure-boss.yml -vvv:

(venv) qwertzlbert@qwertzlbert-ansible:~/ansible/boss_test$ ansible-playbook -i hosts configure-boss.yml -vvv
ansible-playbook 2.9.13
  config file = /home/qwertzlbert/ansible/boss_test/ansible.cfg
  configured module search path = ['/home/qwertzlbert/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /home/qwertzlbert/ansible/venv/lib/python3.8/site-packages/ansible
  executable location = /home/qwertzlbert/ansible/venv/bin/ansible-playbook
  python version = 3.8.2 (default, Jul 16 2020, 14:00:26) [GCC 9.3.0]
Using /home/qwertzlbert/ansible/boss_test/ansible.cfg as config file
host_list declined parsing /home/qwertzlbert/ansible/boss_test/hosts as it did not pass its verify_file() method
script declined parsing /home/qwertzlbert/ansible/boss_test/hosts as it did not pass its verify_file() method
auto declined parsing /home/qwertzlbert/ansible/boss_test/hosts as it did not pass its verify_file() method
Parsed /home/qwertzlbert/ansible/boss_test/hosts inventory source with ini plugin

PLAYBOOK: configure-boss.yml ************************************************************************************************
1 plays in configure-boss.yml

PLAY [boss RAID test] *******************************************************************************************************
META: ran handlers

TASK [view Storage] *********************************************************************************************************
task path: /home/qwertzlbert/ansible/boss_test/configure-boss.yml:10
<test> ESTABLISH LOCAL CONNECTION FOR USER: qwertzlbert
<test> EXEC /bin/sh -c 'echo ~qwertzlbert && sleep 0'
<test> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/qwertzlbert/.ansible/tmp `"&& mkdir "` echo /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994749.6851676-34121-170760483132753 `" && echo ansible-tmp-1598994749.6851676-34121-170760483132753="` echo /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994749.6851676-34121-170760483132753 `" ) && sleep 0'
<test> Attempting python interpreter discovery
<test> EXEC /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'python3.5'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'python2.6'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0'
<test> EXEC /bin/sh -c '/usr/bin/python3 && sleep 0'
Using module file /home/qwertzlbert/ansible/boss_test/collections/ansible_collections/dellemc/openmanage/plugins/modules/dellemc_idrac_storage_volume.py
<test> PUT /home/qwertzlbert/.ansible/tmp/ansible-local-34116cvgtz82v/tmp9320ag_q TO /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994749.6851676-34121-170760483132753/AnsiballZ_dellemc_idrac_storage_volume.py
<test> EXEC /bin/sh -c 'chmod u+x /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994749.6851676-34121-170760483132753/ /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994749.6851676-34121-170760483132753/AnsiballZ_dellemc_idrac_storage_volume.py && sleep 0'
<test> EXEC /bin/sh -c '/usr/bin/python3 /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994749.6851676-34121-170760483132753/AnsiballZ_dellemc_idrac_storage_volume.py && sleep 0'
<test> EXEC /bin/sh -c 'rm -f -r /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994749.6851676-34121-170760483132753/ > /dev/null 2>&1 && sleep 0'
ok: [test] => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python3"
    },
    "changed": false,
    "invocation": {
        "module_args": {
            "capacity": null,
            "controller_id": null,
            "disk_cache_policy": "Default",
            "idrac_ip": "idrac-ip",
            "idrac_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "idrac_port": 443,
            "idrac_user": "idrac_user",
            "media_type": null,
            "number_dedicated_hot_spare": 0,
           "protocol": null,
            "raid_init_operation": null,
            "raid_reset_config": "False",
            "read_cache_policy": "NoReadAhead",
            "span_depth": 1,
            "span_length": 1,
            "state": "view",
            "stripe_size": 65536,
            "volume_id": null,
            "volume_type": "RAID 0",
            "volumes": null,
            "write_cache_policy": "WriteThrough"
        }
    },
    "msg": "Successfully completed the view storage volume operation",
    "storage_status": {
        "Message": {
            "Controller": {
                "AHCI.Embedded.1-1": {
                    "ControllerSensor": {
                        "AHCI.Embedded.1-1": {}
                    }
                },
                "AHCI.Embedded.2-1": {
                    "ControllerSensor": {
                        "AHCI.Embedded.2-1": {}
                    }
                },
                "AHCI.Slot.5-1": {
                    "ControllerSensor": {
                        "AHCI.Slot.5-1": {}
                    },
                    "PhysicalDisk": [
                        "Disk.Direct.0-0:AHCI.Slot.5-1",
                        "Disk.Direct.1-1:AHCI.Slot.5-1"
                    ],
                    "VirtualDisk": {
                        "Disk.Virtual.0:AHCI.Slot.5-1": {
                            "PhysicalDisk": [
                                "Disk.Direct.0-0:AHCI.Slot.5-1",
                                "Disk.Direct.1-1:AHCI.Slot.5-1"
                            ]
                        }
                    }
                },
                "RAID.Integrated.1-1": {
                    "ControllerSensor": {
                        "RAID.Integrated.1-1": {
                            "ControllerBattery": [
                                "Battery.Integrated.1:RAID.Integrated.1-1"
                            ]
                        }
                    },
                    "Enclosure": {
                        "Enclosure.Internal.0-1:RAID.Integrated.1-1": {
                            "EnclosureSensor": {
                                "Enclosure.Internal.0-1:RAID.Integrated.1-1": {}
                            },
                            "PhysicalDisk": [
                                "Disk.Bay.0:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.1:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.2:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.3:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.4:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.5:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.6:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.7:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.8:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.9:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.10:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.11:Enclosure.Internal.0-1:RAID.Integrated.1-1"
                            ]
                        }
                    },
                    "VirtualDisk": {
                        "Disk.Virtual.0:RAID.Integrated.1-1": {
                            "PhysicalDisk": [
                                "Disk.Bay.1:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.2:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.4:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.5:Enclosure.Internal.0-1:RAID.Integrated.1-1",
                                "Disk.Bay.10:Enclosure.Internal.0-1:RAID.Integrated.1-1"
                            ]
                        }
                    }
                }
            }
        },
        "Status": "Success"
    }
}

TASK [create RAID volume] ***************************************************************************************************
task path: /home/qwertzlbert/ansible/boss_test/configure-boss.yml:17
<test> ESTABLISH LOCAL CONNECTION FOR USER: qwertzlbert
<test> EXEC /bin/sh -c 'echo ~qwertzlbert && sleep 0'
<test> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/qwertzlbert/.ansible/tmp `"&& mkdir "` echo /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994777.5390363-34160-276495466052099 `" && echo ansible-tmp-1598994777.5390363-34160-276495466052099="` echo /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994777.5390363-34160-276495466052099 `" ) && sleep 0'
Using module file /home/qwertzlbert/ansible/boss_test/collections/ansible_collections/dellemc/openmanage/plugins/modules/dellemc_idrac_storage_volume.py
<test> PUT /home/qwertzlbert/.ansible/tmp/ansible-local-34116cvgtz82v/tmpbj6fk9yg TO /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994777.5390363-34160-276495466052099/AnsiballZ_dellemc_idrac_storage_volume.py
<test> EXEC /bin/sh -c 'chmod u+x /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994777.5390363-34160-276495466052099/ /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994777.5390363-34160-276495466052099/AnsiballZ_dellemc_idrac_storage_volume.py && sleep 0'
<test> EXEC /bin/sh -c '/usr/bin/python3 /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994777.5390363-34160-276495466052099/AnsiballZ_dellemc_idrac_storage_volume.py && sleep 0'
<test> EXEC /bin/sh -c 'rm -f -r /home/qwertzlbert/.ansible/tmp/ansible-tmp-1598994777.5390363-34160-276495466052099/ > /dev/null 2>&1 && sleep 0'
[WARNING]: The value True (type bool) in a string field was converted to 'True' (type string). If this does not look like
what you expect, quote the entire value to ensure it does not change.
changed: [test] => {
    "changed": true,
    "invocation": {
        "module_args": {
            "capacity": null,
            "controller_id": "AHCI.Slot.5-1",
            "disk_cache_policy": "Default",
            "idrac_ip": "idrac-ip",
            "idrac_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "idrac_port": 443,
            "idrac_user": "idrac_user",
            "media_type": null,
            "number_dedicated_hot_spare": 0,
            "protocol": null,
            "raid_init_operation": null,
            "raid_reset_config": "True",
            "read_cache_policy": "NoReadAhead",
            "span_depth": 1,
            "span_length": 1,
            "state": "create",
            "stripe_size": 65536,
            "volume_id": null,
            "volume_type": "RAID 0",
            "volumes": [
                {
                    "disk_cache_policy": "Default",
                    "drives": {
                        "id": [
                            "Disk.Direct.0-0:AHCI.Slot.5-1",
                            "Disk.Direct.1-1:AHCI.Slot.5-1"
                        ]
                    },
                    "name": "Test-R1",
                    "raid_init_operation": "Fast",
                    "span_depth": 1,
                    "span_length": 2,
                    "stripe_size": 65536,
                    "volume_type": "RAID 1"
                }
            ],
            "write_cache_policy": "WriteThrough"
        }
    },
    "msg": "Successfully completed the create storage volume operation",
    "storage_status": {
        "@odata.context": "/redfish/v1/$metadata#DellJob.DellJob",
        "@odata.id": "/redfish/v1/Managers/iDRAC.Embedded.1/Jobs/JID_989948526562",
        "@odata.type": "#DellJob.v1_0_2.DellJob",
        "CompletionTime": "2020-09-01T23:23:24",
        "Description": "Job Instance",
        "EndTime": null,
        "Id": "JID_989948526562",
        "JobState": "Completed",
        "JobType": "ImportConfiguration",
        "Message": "Successfully imported and applied Server Configuration Profile.",
        "MessageArgs": [],
        "MessageArgs@odata.count": 0,
        "MessageId": "SYS053",
        "Name": "Import Configuration",
        "PercentComplete": 100,
        "StartTime": "TIME_NOW",
        "Status": "Success",
       "TargetSettingsURI": null,
        "retval": true
    }
}

TASK [debug] ****************************************************************************************************************
task path: /home/qwertzlbert/ansible/boss_test/configure-boss.yml:37
ok: [test] => {
    "result_raid": {
        "changed": true,
        "failed": false,
        "msg": "Successfully completed the create storage volume operation",
        "storage_status": {
            "@odata.context": "/redfish/v1/$metadata#DellJob.DellJob",
            "@odata.id": "/redfish/v1/Managers/iDRAC.Embedded.1/Jobs/JID_989948526562",
            "@odata.type": "#DellJob.v1_0_2.DellJob",
            "CompletionTime": "2020-09-01T23:23:24",
            "Description": "Job Instance",
            "EndTime": null,
            "Id": "JID_989948526562",
            "JobState": "Completed",
            "JobType": "ImportConfiguration",
            "Message": "Successfully imported and applied Server Configuration Profile.",
            "MessageArgs": [],
            "MessageArgs@odata.count": 0,
            "MessageId": "SYS053",
            "Name": "Import Configuration",
            "PercentComplete": 100,
            "StartTime": "TIME_NOW",
            "Status": "Success",
            "TargetSettingsURI": null,
            "retval": true
        },
        "warnings": [
            "The value True (type bool) in a string field was converted to 'True' (type string). If this does not look like what you expect, quote the entire value to ensure it does not change."
        ]
    }
}
META: ran handlers
META: ran handlers

PLAY RECAP ******************************************************************************************************************
test                       : ok=3    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

janr7 commented 4 years ago

Hi

Modified the play to use "span_length: 2".
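
For reference, the modified task, reconstructed from the module_args in the verbose output below (the iDRAC password is shown here as a placeholder variable):

- name: Create Raid boot volume
  dellemc_idrac_storage_volume:
    idrac_ip: "10.145.103.135"
    idrac_user: "Laboradmin"
    idrac_password: "{{ idrac_password }}"
    raid_reset_config: True
    state: create
    controller_id: AHCI.Slot.2-1
    volumes:
      - name: "VD_R1_1_a"
        volume_type: RAID 1
        disk_cache_policy: Default
        raid_init_operation: Fast
        span_length: 2
        span_depth: 1
        stripe_size: 65536
        drives:
          id: ["Disk.Direct.0-0:AHCI.Slot.2-1", "Disk.Direct.1-1:AHCI.Slot.2-1"]
  register: result_setup_raid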

A view was requested before this run; there were no existing VDs at all.

This is the result with span_length 2, while there are no RAID VDs on the controller:

Thanks so much.

ok: [localhost] => {
    "result_setup_raid": {
        "changed": false,
        "failed": false,
        "msg": "No changes found to commit!",
        "storage_status": {
            "Message": "No changes found to commit!",
            "Status": "Success"
        }
    }
}

Here is the full verbose output with span_length 2:

TASK [Create Raid boot volume] **********************************************************************************************
task path: /global/instserv/ansible/playbooks/dellemc_idrac_storage_volume_create_AHCI_raid1.yml:14
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: root
<127.0.0.1> EXEC /bin/sh -c 'echo ~root && sleep 0'
Using module file /global/instserv/ansible/ansible_venv/lib/python3.7/site-packages/ansible/modules/remote_management/dellemc/dellemc_idrac_storage_volume.py
<127.0.0.1> PUT /root/.ansible/tmp/ansible-local-3351ejp22bxb/tmpq2zvxh3k TO /root/.ansible/tmp/ansible-tmp-1599032398.6180787-3359-114350257092069/AnsiballZ_dellemc_idrac_storage_volume.py
<127.0.0.1> EXEC /bin/sh -c '/global/instserv/ansible/ansible_venv/bin/python /root/.ansible/tmp/ansible-tmp-1599032398.6180787-3359-114350257092069/AnsiballZ_dellemc_idrac_storage_volume.py && sleep 0'
[... internal task-queueing, plugin-loading, and temp-file handling debug output trimmed ...]

Module stdout excerpt:

WARN: Changing isFolder to false, as it is not directory
msg_id=SYS043 Severity=Informational
could not convert string to float: 'Not Available'
[... the same message repeated ...]

Module stderr excerpt:

ERROR:omsdk.typemgr.ArrayType:Invalid attribute FreeSize
[... repeated ...]
ERROR:omsdk.typemgr.ArrayType:cannot compare with IntField

ok: [localhost] => {
    "changed": false,
    "invocation": {
        "module_args": {
            "capacity": null,
            "controller_id": "AHCI.Slot.2-1",
            "disk_cache_policy": "Default",
            "idrac_ip": "10.145.103.135",
            "idrac_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "idrac_port": 443,
            "idrac_user": "Laboradmin",
            "media_type": null,
            "number_dedicated_hot_spare": 0,
            "protocol": null,
            "raid_init_operation": null,
            "raid_reset_config": "True",
            "read_cache_policy": "NoReadAhead",
            "span_depth": 1,
            "span_length": 1,
            "state": "create",
            "stripe_size": 65536,
            "volume_id": null,
            "volume_type": "RAID 0",
            "volumes": [
                {
                    "disk_cache_policy": "Default",
                    "drives": {
                        "id": [
                            "Disk.Direct.0-0:AHCI.Slot.2-1",
                            "Disk.Direct.1-1:AHCI.Slot.2-1"
                        ]
                    },
                    "name": "VD_R1_1_a",
                    "raid_init_operation": "Fast",
                    "span_depth": 1,
                    "span_length": 2,
                    "stripe_size": 65536,
                    "volume_type": "RAID 1"
                }
            ],
            "write_cache_policy": "WriteThrough"
        }
    },
    "msg": "No changes found to commit!",
    "storage_status": {
        "Message": "No changes found to commit!",
        "Status": "Success"
    }
}

TASK [debug] ****************************************************************************************************************
task path: /global/instserv/ansible/playbooks/dellemc_idrac_storage_volume_create_AHCI_raid1.yml:37
ok: [localhost] => {
    "result_setup_raid": {
        "changed": false,
        "failed": false,
        "msg": "No changes found to commit!",
        "storage_status": {
            "Message": "No changes found to commit!",
            "Status": "Success"
        }
    }
}
META: ran handlers

PLAY RECAP ******************************************************************************************************************
localhost                  : ok=3    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

janr7 commented 4 years ago

Will close this issue, as the BOSS controller can be configured with the redfish module. Thank you.
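
For anyone landing here later: a minimal sketch of that Redfish-based alternative, assuming the dellemc.openmanage.redfish_storage_volume module from a newer release of this collection and reusing the controller and disk IDs from this thread; verify the parameters against the docs of your installed version:

- name: Create a RAID 1 volume on the BOSS controller via Redfish
  dellemc.openmanage.redfish_storage_volume:
    baseuri: "{{ idrac_ip }}:443"
    username: "{{ idrac_user }}"
    password: "{{ idrac_password }}"
    state: present
    volume_type: Mirrored          # Redfish equivalent of RAID 1
    name: "VD_R1_1_a"
    controller_id: "AHCI.Slot.2-1"
    drives:
      - Disk.Direct.0-0:AHCI.Slot.2-1
      - Disk.Direct.1-1:AHCI.Slot.2-1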