Closed: liamwh closed this issue 3 years ago.
Thanks for pointing this out. We have added this to our backlog and will provide the fix in a subsequent release.
Hi @liamwh, since you created the server profile from a server profile template, could you please share your server profile template payload with us? It will help us find the root cause of the issue.
Hi Venkatesh, can you please clarify what you mean by a Server Profile payload? The Server Profile and Server Profile Template both exist in OneView.
Hi @liamwh, idempotency on a server profile can sometimes misbehave when extra configuration is added to the profile (for example a customized OS deployment plan, local storage, or BIOS settings). The playbook you shared looks very simple, though, and contains no extra configuration on the server profile itself that would break idempotency. So I suspect you have added some extra configuration on the server profile template and used that template (serverProfileTemplateName) during profile creation, which broke the idempotency. I would therefore like to check your server profile template configuration (which we also call the payload), reproduce the same configuration in our testing setup, and fix the library files accordingly.
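For reference, a minimal sketch of how the template payload can be dumped with this library's facts module; the connection parameters mirror the ones used elsewhere in this thread, and the server_profile_template_name variable is a hypothetical placeholder:

---
- hosts: localhost
  tasks:
    - name: Gather facts about the server profile template
      oneview_server_profile_template_facts:
        hostname: "{{ oneview_ip }}"
        username: "{{ oneview_username }}"
        password: "{{ oneview_password }}"
        auth_login_domain: "LOCAL"
        api_version: 2200
        name: "{{ server_profile_template_name }}"  # hypothetical variable holding the template name
      delegate_to: localhost

    - name: Show the template payload returned as an Ansible fact
      debug:
        var: server_profile_templates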
Thanks Venkatesh, I will provide the requested details tomorrow
Hi Venkatesh, the management card.server_profile_template value is "DL380 Gen9 1GB-Nic Disabled", without the quotation marks.
The Server Profile Template facts can be found in JSON below, with some values removed for security reasons:
{ "ansible_facts": { "server_profile_templates": [ { "affinity": null, "bios": { "complianceControl": "Checked", "manageBios": true, "overriddenSettings": [ { "id": "EmbNicEnable", "value": "Disabled" }, { "id": "FlexLom1Enable", "value": "Disabled" }, { "id": "PowerRegulator", "value": "OsControl" }, { "id": "PowerProfile", "value": "Custom" } ] }, "boot": { "complianceControl": "Checked", "manageBoot": true, "order": [ "CD", "HardDisk", "USB", "PXE" ] }, "bootMode": { "complianceControl": "Checked", "manageMode": true, "mode": "BIOS", "pxeBootPolicy": null, "secureBoot": "Unmanaged" }, "category": "server-profile-templates", "connectionSettings": { "complianceControl": "Unchecked", "connections": [], "manageConnections": false }, "created": "2021-02-09T14:28:47.450Z", "description": "DL380 Gen9 Included Disks", "eTag": "1615204548506/7", "enclosureGroupUri": null, "firmware": { "complianceControl": "Checked", "firmwareActivationType": "Immediate", "firmwareBaselineUri": "/rest/firmware-drivers/P35935_001_spp-2020_09_0-SPP2020090_2020_0901_114", "firmwareInstallType": "FirmwareOnlyOfflineMode", "forceInstallFirmware": false, "manageFirmware": true }, "hideUnusedFlexNics": null, "iscsiInitiatorNameType": "AutoGenerated", "localStorage": { "complianceControl": "Checked", "controllers": [ { "deviceSlot": "Embedded", "driveWriteCache": "Unmanaged", "initialize": true, "logicalDrives": [ { "accelerator": "Unmanaged", "bootable": true, "driveTechnology": "SataSsd", "name": "Local", "numPhysicalDrives": 2, "numSpareDrives": null, "raidLevel": "RAID1", "sasLogicalJBODId": null } ], "mode": "RAID", "predictiveSpareRebuild": "Unmanaged" } ], "sasLogicalJBODs": [] }, "macType": "Physical", "managementProcessor": { "complianceControl": "Checked", "manageMp": true, "mpSettings": [ { "args": { "notes": "REMOVED FOR OBSCURITY" }, "settingType": "Directory" }, { "args": { "directoryGroupAccounts": [ {"notes": "REMOVED FOR OBSCURITY"} ] }, "settingType": "DirectoryGroups" } ] }, "modified": "2021-03-08T11:55:48.506Z", "name": "DL380 Gen9 1GB-Nic Disabled", "osDeploymentSettings": null, "refreshState": "NotRefreshing", "sanStorage": { "complianceControl": "Unchecked", "manageSanStorage": false, "sanSystemCredentials": [], "volumeAttachments": [] }, "scopesUri": "/rest/scopes/resources/rest/server-profile-templates/87f49ba3-ca1a-4446-a5a0-899342b3b438", "serialNumberType": "Physical", "serverHardwareTypeUri": "/rest/server-hardware-types/DCC3BC07-B540-4F74-9986-8BEAC4AB0817", "serverProfileDescription": "DL380 Gen9", "state": null, "status": "OK", "type": "ServerProfileTemplateV8", "uri": "/rest/server-profile-templates/87f49ba3-ca1a-4446-a5a0-899342b3b438", "wwnType": "Physical" } ] }, "changed": false }
@VenkateshRavula Tagging for visibility.
Thanks @liamwh for your response.
I created a server profile template with a similar configuration in my environment and re-ran the server profile playbook multiple times. The idempotency worked perfectly for me.
I suspect that the description parameter is causing the idempotency issue for you, since you concatenate {{ ansible_date_time.date }} to it, which produces a different value whenever you run the playbook on two different dates (you should not hit the issue when you run the playbook multiple times on the same day, though).
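To illustrate the point, a task along these lines (a hypothetical fragment, not taken from the actual playbook; the profile name is assumed) would compare unequal to the stored profile whenever the date changes, so the module reports changed:

    - name: Create server from server profile template
      oneview_server_profile:
        hostname: "{{ oneview_ip }}"
        username: "{{ oneview_username }}"
        password: "{{ oneview_password }}"
        auth_login_domain: "LOCAL"
        api_version: 2200
        data:
          name: "example-profile"  # hypothetical profile name
          # Rendered anew on every run; the value differs across dates,
          # so the comparison against the existing profile shows a diff:
          description: "Deployed by Ansible on {{ ansible_date_time.date }}"
          # A static description avoids the date-driven change:
          # description: "Deployed by Ansible"
          serverProfileTemplateName: "DL380 Gen9 1GB-Nic Disabled"
      delegate_to: localhost

Note that ansible_date_time is only populated when fact gathering is enabled (gather_facts: true), as in the playbooks in this thread.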
Please help me with your inputs so I can understand this issue better. In the meantime, you could also remove the description parameter in your playbook and check if you still face the idempotency issue.
Thanks @VenkateshRavula.
Using this:
Still causes the idempotency issue seen here:
Hi @liamwh, I have tried all the scenarios as per your inputs using the playbook you provided, but I didn't face the idempotency issue anywhere. I am also not sure whether the latest ansible library changes are getting invoked in your environment. Can you try your playbook using our docker image to rule out environment/setup related issues? Pull the latest 6.0 image if you want to give it a try: https://hub.docker.com/r/hewlettpackardenterprise/hpe-oneview-sdk-for-ansible
I have used the below playbook in my environment and it worked perfectly for me in all my executions.
---
- hosts: localhost
  gather_facts: true
  tasks:
    - name: Gather server hardware facts
      oneview_server_hardware_facts:
        hostname: "{{ oneview_ip }}"
        username: "{{ oneview_username }}"
        password: "{{ oneview_password }}"
        auth_login_domain: "LOCAL"
        api_version: 2200
        name: "{{ server_name }}"
      delegate_to: localhost
      register: hardware_facts

    - name: Create server from server profile template
      oneview_server_profile:
        hostname: "{{ oneview_ip }}"
        username: "{{ oneview_username }}"
        password: "{{ oneview_password }}"
        auth_login_domain: "LOCAL"
        api_version: 2200
        data:
          name: "sp-from-spt"
          description: "added by ansible"
          serverHardwareUri: "{{ hardware_facts.ansible_facts.server_hardwares.uri }}"
          serverProfileTemplateName: 'SPT-test'
      delegate_to: localhost
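An explicit way to verify the re-run behaviour (a sketch only; the sp_result register variable and the assert task are illustrative additions, not part of the playbook above) is to register the module result and assert that a second execution reports no change:

    - name: Create server from server profile template
      oneview_server_profile:
        hostname: "{{ oneview_ip }}"
        username: "{{ oneview_username }}"
        password: "{{ oneview_password }}"
        auth_login_domain: "LOCAL"
        api_version: 2200
        data:
          name: "sp-from-spt"
          description: "added by ansible"
          serverHardwareUri: "{{ hardware_facts.ansible_facts.server_hardwares.uri }}"
          serverProfileTemplateName: 'SPT-test'
      delegate_to: localhost
      register: sp_result

    - name: Fail if the module still reports a change on re-run
      assert:
        that:
          - not sp_result.changed
        fail_msg: "oneview_server_profile reported 'changed' on re-run, so idempotency is broken"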
Hi Venkatesh,
I get the module error shown below:
Using the below playbook:
Hi @liamwh, I am not getting the module errors when I try the playbook execution in the docker environment. Below are the steps I followed.
- Pulled the hpe-oneview-ansible docker image
$ docker pull hewlettpackardenterprise/hpe-oneview-sdk-for-ansible:v6.0.0-OV6.0
- Ran the docker container and attached my ansible repository to it as a volume
$ docker run -it -v <repo_path>:/root/oneview-ansible --name ov-ansible-60 -t hewlettpackardenterprise/hpe-oneview-sdk-for-ansible:v6.0.0-OV6.0 /bin/bash
- That landed me inside the container in the ~/oneview-ansible path (same as shown in your screenshot)
- Then I simply executed my playbook and it ran successfully.
The only difference is that you changed to the /mnt/ directory before executing your playbook, which should also work.
Since there is no update from you, we are closing this issue on the assumption that it is resolved. Please feel free to reopen.
Scenario/Intent
Trying to achieve idempotent deployment of HPE servers with accurate OK/Changed output from Ansible.
Environment Details
Steps to Reproduce
Run the playbook x2.
Expected Result
Status OK returned from Ansible, not Changed.
Actual Result