**lindhe** opened 2 days ago
With some inspiration from https://github.com/ansible-collections/netapp.ontap/commit/56f44a9700b72dfa4d3b7122465b2910049be1d0 I figured out that setting `use_rest: never` fixes the issue! So the problem is likely in the REST implementation.
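For anyone hitting the same failure, the workaround can be applied per task. This is a sketch only, assuming the same task shape as the reproduction playbook later in this thread; it forces the module onto the ZAPI code path, which is reported to behave idempotently:

```yaml
# Workaround sketch (assumed task shape, not an official fix):
# force na_ontap_snapshot_policy to use ZAPI instead of REST.
- name: Create snapshot policy (ZAPI workaround)
  netapp.ontap.na_ontap_snapshot_policy:
    name: ansible2
    schedule:
      - 'daily'
    count:
      - 2
    enabled: true
    use_rest: never   # ZAPI path; the REST path is what fails with 409 here
```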
@lindhe We could not reproduce the error in-house for the task below: idempotency was observed as expected with ONTAP 9.14.1 and Ansible collection 22.11.0.
```yaml
- name: Create snapshot policy - REST
  netapp.ontap.na_ontap_snapshot_policy:
    use_rest: always
    state: present
    enabled: True
    name: test_policy1
    schedule:
      - 'daily'
    count:
      - 2
  register: result
```
Could you enable REST API tracing and share the `ontap_apis.log` content that shows the calls being sent to REST and the responses/errors received?
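For reference, tracing can be switched on per task through the collection's `feature_flags` option. A minimal sketch, assuming the same connection options as the other tasks in this thread (the log lands in `/tmp/ontap_apis.log`):

```yaml
# Sketch: enable REST API tracing for one task via feature_flags.
- name: Create snapshot policy - REST
  netapp.ontap.na_ontap_snapshot_policy:
    name: test_policy1
    # ... connection options (hostname, username, password, ...) as usual ...
    feature_flags:
      trace_apis: true   # writes request/response traces to /tmp/ontap_apis.log
```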
How strange. Maybe it works differently between ONTAP 9.12.1 and ONTAP 9.14.1?
I tried enabling the API trace, but I got no `ontap_apis.log` in my current working directory. Is it located elsewhere, or did I miss something?

EDIT: Yup, I missed the docs saying the file is located at `/tmp/ontap_apis.log`. I'll have to look through it before sharing, to make sure there are no secrets in it. I'll get back soon.
There, now I've got it all in order, I believe. Here's the trace. I've just added a blank line between the two runs.
```
2024-07-06 17:48:12,652 DEBUG sending: {'method': 'GET', 'url': 'https://192.168.0.1/api/cluster', 'verify': False, 'params': {'fields': ['version']}, 'timeout': 60, 'json': None, 'headers': 'redacted', 'auth_args': 'redacted'}
2024-07-06 17:48:12,653 DEBUG Starting new HTTPS connection (1): 192.168.0.1:443
2024-07-06 17:48:12,712 DEBUG https://192.168.0.1:443 "GET /api/cluster?fields=version HTTP/1.1" 200 168
2024-07-06 17:48:12,713 DEBUG 200: b'{\n "version": {\n "full": "NetApp Release 9.12.1P10: Wed Dec 20 00:24:59 UTC 2023",\n "generation": 9,\n "major": 12,\n "minor": 1\n },\n "_links": {\n "self": {\n "href": "/api/cluster"\n }\n }\n}'
2024-07-06 17:48:12,714 DEBUG sending: {'method': 'GET', 'url': 'https://192.168.0.1/api/storage/snapshot-policies', 'verify': False, 'params': {'name': 'ansible2', 'scope': 'cluster', 'fields': 'enabled,svm.uuid,comment,copies.snapmirror_label,copies.count,copies.prefix,copies.schedule.name,scope'}, 'timeout': 60, 'json': None, 'headers': 'redacted', 'auth_args': 'redacted'}
2024-07-06 17:48:12,715 DEBUG Starting new HTTPS connection (1): 192.168.0.1:443
2024-07-06 17:48:12,779 DEBUG https://192.168.0.1:443 "GET /api/storage/snapshot-policies?name=ansible2&scope=cluster&fields=enabled%2Csvm.uuid%2Ccomment%2Ccopies.snapmirror_label%2Ccopies.count%2Ccopies.prefix%2Ccopies.schedule.name%2Cscope HTTP/1.1" 200 198
2024-07-06 17:48:12,779 DEBUG 200: b'{\n "records": [\n ],\n "num_records": 0,\n "_links": {\n "self": {\n "href": "/api/storage/snapshot-policies?name=ansible2&scope=cluster&fields=enabled%2Csvm.uuid%2Ccomment%2Ccopies.snapmirror_label%2Ccopies.count%2Ccopies.prefix%2Ccopies.schedule.name%2Cscope"\n }\n }\n}'
2024-07-06 17:48:12,780 DEBUG sending: {'method': 'POST', 'url': 'https://192.168.0.1/api/storage/snapshot-policies', 'verify': False, 'params': {'return_timeout': 30}, 'timeout': 60, 'json': {'name': 'ansible2', 'enabled': True, 'copies': [{'schedule': {'name': 'daily'}, 'count': 2}]}, 'headers': 'redacted', 'auth_args': 'redacted'}
2024-07-06 17:48:12,781 DEBUG Starting new HTTPS connection (1): 192.168.0.1:443
2024-07-06 17:48:12,856 DEBUG https://192.168.0.1:443 "POST /api/storage/snapshot-policies?return_timeout=30 HTTP/1.1" 201 3
2024-07-06 17:48:12,856 DEBUG 201: b'{\n}'

2024-07-06 17:48:38,092 DEBUG sending: {'method': 'GET', 'url': 'https://192.168.0.1/api/cluster', 'verify': False, 'params': {'fields': ['version']}, 'timeout': 60, 'json': None, 'headers': 'redacted', 'auth_args': 'redacted'}
2024-07-06 17:48:38,093 DEBUG Starting new HTTPS connection (1): 192.168.0.1:443
2024-07-06 17:48:38,155 DEBUG https://192.168.0.1:443 "GET /api/cluster?fields=version HTTP/1.1" 200 168
2024-07-06 17:48:38,156 DEBUG 200: b'{\n "version": {\n "full": "NetApp Release 9.12.1P10: Wed Dec 20 00:24:59 UTC 2023",\n "generation": 9,\n "major": 12,\n "minor": 1\n },\n "_links": {\n "self": {\n "href": "/api/cluster"\n }\n }\n}'
2024-07-06 17:48:38,156 DEBUG sending: {'method': 'GET', 'url': 'https://192.168.0.1/api/storage/snapshot-policies', 'verify': False, 'params': {'name': 'ansible2', 'scope': 'cluster', 'fields': 'enabled,svm.uuid,comment,copies.snapmirror_label,copies.count,copies.prefix,copies.schedule.name,scope'}, 'timeout': 60, 'json': None, 'headers': 'redacted', 'auth_args': 'redacted'}
2024-07-06 17:48:38,157 DEBUG Starting new HTTPS connection (1): 192.168.0.1:443
2024-07-06 17:48:38,219 DEBUG https://192.168.0.1:443 "GET /api/storage/snapshot-policies?name=ansible2&scope=cluster&fields=enabled%2Csvm.uuid%2Ccomment%2Ccopies.snapmirror_label%2Ccopies.count%2Ccopies.prefix%2Ccopies.schedule.name%2Cscope HTTP/1.1" 200 198
2024-07-06 17:48:38,220 DEBUG 200: b'{\n "records": [\n ],\n "num_records": 0,\n "_links": {\n "self": {\n "href": "/api/storage/snapshot-policies?name=ansible2&scope=cluster&fields=enabled%2Csvm.uuid%2Ccomment%2Ccopies.snapmirror_label%2Ccopies.count%2Ccopies.prefix%2Ccopies.schedule.name%2Cscope"\n }\n }\n}'
2024-07-06 17:48:38,221 DEBUG sending: {'method': 'POST', 'url': 'https://192.168.0.1/api/storage/snapshot-policies', 'verify': False, 'params': {'return_timeout': 30}, 'timeout': 60, 'json': {'name': 'ansible2', 'enabled': True, 'copies': [{'schedule': {'name': 'daily'}, 'count': 2}]}, 'headers': 'redacted', 'auth_args': 'redacted'}
2024-07-06 17:48:38,222 DEBUG Starting new HTTPS connection (1): 192.168.0.1:443
2024-07-06 17:48:38,285 DEBUG https://192.168.0.1:443 "POST /api/storage/snapshot-policies?return_timeout=30 HTTP/1.1" 409 102
2024-07-06 17:48:38,286 DEBUG 409: b'{\n "error": {\n "message": "Policy name \\"ansible2\\" already exists. ",\n "code": "1638527"\n }\n}'
2024-07-06 17:48:38,286 ERROR 409: Endpoint error: 409: {'message': 'Policy name "ansible2" already exists. ', 'code': '1638527'}
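Reading the trace, both runs do a check-then-create: a GET on `/api/storage/snapshot-policies` filtered by `name=ansible2&scope=cluster` returns zero records, so the module POSTs, and the second POST fails with 409 because the policy already exists. One possible reading (an assumption, not confirmed from the module source) is that the lookup filter does not match how the policy was actually created, so the existence check always comes up empty. A minimal sketch of that check-then-create logic, with illustrative names:

```python
# Sketch of a check-then-create flow matching the trace above.
# ensure_policy is an illustrative name, not the module's real code.

def ensure_policy(existing_records, desired_name):
    """Return the action an idempotent module should take."""
    matches = [r for r in existing_records if r.get("name") == desired_name]
    if matches:
        return "none"    # policy found: report ok, skip the POST
    return "create"      # policy not found: POST to create it

# The GET in the trace (filtered with scope=cluster) returned "records": [],
# so the module chooses "create" on every run; the second POST then hits 409
# because the policy does exist, just not where the lookup searched.
print(ensure_policy([], "ansible2"))                      # second run still creates
print(ensure_policy([{"name": "ansible2"}], "ansible2"))  # what idempotency needs
```

The fix would be making the lookup find the existing policy (so the function returns "none" on the second run), rather than suppressing the 409.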
I ran it using this command:

```shell
ansible-playbook --ask-vault-password -i ./inventory.yaml ./playbook.yaml
```
With these config files:
```yaml
---
- name: Configure SVMs
  hosts: all
  connection: local
  tasks:
    - name: Create snapshot policy
      netapp.ontap.na_ontap_snapshot_policy:
        name: ansible2
        schedule:
          - 'daily'
        count:
          - 2
        enabled: true
        hostname: "{{ managementLIF }}"
        password: "{{ password }}"
        username: "{{ username }}"
        validate_certs: false
        use_rest: always
        feature_flags:
          trace_apis: true
```
```yaml
ungrouped:
  hosts:
    svm1:
      managementLIF: 192.168.0.1
      username: vsadmin
      password:
```
### Summary

It seems like `na_ontap_snapshot_policy` is not idempotent. When a snapshot policy already exists, the task fails rather than making changes or skipping the execution.

### Component Name

netapp.ontap.na_ontap_snapshot_policy

### Ansible Version

### ONTAP Collection Version

### ONTAP Version

I don't have access to the NetApp cluster, so I cannot run the command. But I think we are running 9.12.1.

### Playbook

### Steps to Reproduce

Create this `inventory.yaml`:

Run the playbook twice:

### Expected Results

### Actual Results