rposudnevskiy / RBDSR

RBDSR - XenServer/XCP-ng Storage Manager plugin for CEPH
GNU Lesser General Public License v2.1

SR scan issue with ceph xapi plugin #10

Closed (Emmenemoi closed this issue 8 years ago)

Emmenemoi commented 8 years ago

Deleting a VM that has a snapshot:

 vdi_delete {'sr_uuid': '2C740292-44FC-40F6-9949-XXX', 'subtask_of': 'DummyRef:|0e2fd846-9404-2214-59a1-XXX|VDI.destroy', 'vdi_ref': 'OpaqueRef:de7f938d-55f4-d634-c669-XXX', 'vdi_on_boot': 'persist', 'args': [], 'vdi_location': '33dd263e-934f-4094-8794-XXX', 'host_ref': 'OpaqueRef:a3894b6e-409a-3cc3-0132-XXX', 'session_ref': 'OpaqueRef:8d28593e-933b-d7a4-7363-XXX', 'device_config': {'SRmaster': 'true'}, 'command': 'vdi_delete', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:f63541e3-eb71-1619-100d-XXX', 'vdi_uuid': '33dd263e-934f-4094-8794-XXX'}
Aug  8 10:52:14 xenserver-test SM: [14343] RBDVDI.delete for 33dd263e-934f-4094-8794-XXX
Aug  8 10:52:14 xenserver-test SM: [14343] Pause request for 9b0ccb0b-faa7-4b40-9409-XXX
Aug  8 10:52:14 xenserver-test SM: [14343] Calling _unmap_VHD
Aug  8 10:52:14 xenserver-test SM: [14343] Calling ceph_plugin
Aug  8 10:52:14 xenserver-test SM: [14343] ***** generic exception: vdi_delete: EXCEPTION <type 'exceptions.NameError'>, global name 'session' is not defined
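
For context on that NameError: invoking the ceph_plugin from the SM backend needs the XenAPI session passed in explicitly; it is not available as a global. A minimal sketch of what that call path normally looks like (function and argument names are placeholders, not the plugin's actual code):

    # Minimal sketch (hypothetical names): calling a xapi host plugin such as
    # ceph_plugin requires an explicit XenAPI session, which is what the
    # "global name 'session' is not defined" error above points at.

    def call_ceph_plugin(session, host_ref, fn, args):
        # host.call_plugin(host, plugin, fn, args) takes a dict of string
        # arguments and returns the plugin's string result.
        return session.xenapi.host.call_plugin(host_ref, "ceph_plugin", fn, args)

    def unmap_vhd(session, host_ref, sr_uuid, vdi_uuid):
        # Placeholder for the "_unmap_VHD" step from the log above.
        return call_ceph_plugin(session, host_ref, "unmap",
                                {"sr_uuid": sr_uuid, "vdi_uuid": vdi_uuid})

The next vdi_delete then gets further: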

vdi_delete {'sr_uuid': '2C740292-44FC-40F6-9949-XXX', 'subtask_of': 'DummyRef:|22c664dd-5e79-6741-de40-XXX|VDI.destroy', 'vdi_ref': 'OpaqueRef:7a12072b-c57a-e7a1-3d43-XXX', 'vdi_on_boot': 'persist', 'args': [], 'vdi_location': '9b0ccb0b-faa7-4b40-9409-XXX', 'host_ref': 'OpaqueRef:a3894b6e-409a-3cc3-0132-XXX', 'session_ref': 'OpaqueRef:97e967c8-5e88-3d8c-72af-XXX', 'device_config': {'SRmaster': 'true'}, 'command': 'vdi_delete', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:f63541e3-eb71-1619-100d-XXX', 'vdi_uuid': '9b0ccb0b-faa7-4b40-9409-XXX'}
Aug  8 10:54:12 xenserver-test SM: [14999] RBDVDI.delete for 9b0ccb0b-faa7-4b40-9409-XXX
Aug  8 10:54:12 xenserver-test SM: [14999] ['uuidgen', '-r']
Aug  8 10:54:12 xenserver-test SM: [14999]   pread SUCCESS
Aug  8 10:54:12 xenserver-test SM: [14999] ['rbd', 'mv', 'RBD_XenStorage-2C740292-44FC-40F6-9949-XXX/VHD-9b0ccb0b-faa7-4b40-9409-XXX', 'RBD_XenStorage-2C740292-44FC-40F6-9949-XXX/VHD-52ce7013-fb9f-4e87-af3a-XXX', '--name', 'client.xenserver']
Aug  8 10:54:13 xenserver-test SM: [14999]   pread SUCCESS
Aug  8 10:54:13 xenserver-test SM: [14999] RBDVDI.delete set snapshot_of = 9b0ccb0b-faa7-4b40-9409-XXX for 33dd263e-934f-4094-8794-XXX

Then the SR scan fails: sr_scan: EXCEPTION <class 'XenAPI.Failure'>, ['UUID_INVALID', 'VDI', '52ce7013-fb9f-4e87-af3a-XXX']

This is because the RBD image is a snapshot (meta: SNAP-33dd263e-934f-4094-8794-XXX) of a now-deleted VDI.
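
You can confirm this by dumping the leftover image's metadata with the same rbd command the plugin uses (a quick sketch; pool and user names are from my SR, adjust to yours):

    # Sketch: list the image metadata of the leftover RBD image to confirm it
    # only survives as a snapshot record (SNAP-<uuid>) of the deleted VDI.
    import json
    import subprocess

    POOL = "RBD_XenStorage-2C740292-44FC-40F6-9949-XXX"
    IMAGE = "VHD-52ce7013-fb9f-4e87-af3a-XXX"

    out = subprocess.check_output(
        ["rbd", "image-meta", "list", IMAGE,
         "--pool", POOL, "--format", "json", "--name", "client.xenserver"])
    for key, value in json.loads(out).items():
        # Keys like SNAP-33dd263e-... mark snapshots of a (deleted) VDI.
        print("%s = %s" % (key, value))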

rposudnevskiy commented 8 years ago

Fixed. Please check.

mhoffmann75 commented 8 years ago

First of all, your update does fix VM cloning - great! Unfortunately, it does not fix all of the issues with the original "delete VM with snapshot" case:

It seems to leave behind a disk (maybe the snapshot disk?), although I marked the snapshot to be deleted as well. No visible errors so far, though.

However, scanning the SR then fails: the VHD has been deleted in Ceph, which is fine, but XenServer still thinks it exists (see the cross-check sketch after the log):

Aug 9 11:26:52 pns-xen07 SM: [29998] sr_scan {'sr_uuid': 'ff12160f-ff09-40bb-a874-1366ad907f44', 'subtask_of': 'DummyRef:|9dddedcb-ba74-05ac-b306-9b636cd30033|SR.scan', 'args': [], 'host_ref': 'OpaqueRef:0e9c18cb-c243-2d9e-b4db-7bb854e066df', 'session_ref': 'OpaqueRef:d0d95870-58be-dc79-8f0d-4063195b2842', 'device_config': {'SRmaster': 'true'}, 'command': 'sr_scan', 'sr_ref': 'OpaqueRef:eab55273-a6e8-6963-3264-7f2df1efd9f5'}
Aug 9 11:26:52 pns-xen07 SM: [29998] ['ceph', 'df', '--format', 'json', '--name', 'client.admin']
Aug 9 11:26:52 pns-xen07 SM: [29998]   pread SUCCESS
Aug 9 11:26:52 pns-xen07 SM: [29998] ['rbd', 'ls', '-l', '--format', 'json', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin']
Aug 9 11:26:52 pns-xen07 SM: [29998]   pread SUCCESS
Aug 9 11:26:52 pns-xen07 SM: [29998] ['rbd', 'ls', '-l', '--format', 'json', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin']
Aug 9 11:26:52 pns-xen07 SM: [29998]   pread SUCCESS
Aug 9 11:26:52 pns-xen07 SM: [29998] ['rbd', 'image-meta', 'list', u'VHD-46e962e3-e116-4734-b977-973e90f5ba17', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--format', 'json', '--name', 'client.admin']
Aug 9 11:26:52 pns-xen07 SM: [29998]   pread SUCCESS
Aug 9 11:26:53 pns-xen07 SM: [30159] ['ceph', 'df', '--format', 'json', '--name', 'client.admin']
Aug 9 11:26:53 pns-xen07 SM: [30159]   pread SUCCESS
Aug 9 11:26:53 pns-xen07 SM: [30159] vdi_update {'sr_uuid': 'ff12160f-ff09-40bb-a874-1366ad907f44', 'subtask_of': 'DummyRef:|c9b5d1ad-0161-00a0-6692-0e8c53697142|VDI.stat', 'vdi_ref': 'OpaqueRef:4ced5a8a-5c4f-cda2-9b43-80e3376f89ce', 'vdi_on_boot': 'persist', 'args': [], 'vdi_location': '46e962e3-e116-4734-b977-973e90f5ba17', 'host_ref': 'OpaqueRef:0e9c18cb-c243-2d9e-b4db-7bb854e066df', 'session_ref': 'OpaqueRef:9b2ac079-6be9-629f-2ae0-d3b057d66e7c', 'device_config': {'SRmaster': 'true'}, 'command': 'vdi_update', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:eab55273-a6e8-6963-3264-7f2df1efd9f5', 'vdi_uuid': '46e962e3-e116-4734-b977-973e90f5ba17'}
Aug 9 11:26:53 pns-xen07 SM: [30159] RBDSR.update for 46e962e3-e116-4734-b977-973e90f5ba17
Aug 9 11:26:53 pns-xen07 SM: [30159] ['rbd', 'image-meta', 'set', 'VHD-46e962e3-e116-4734-b977-973e90f5ba17', 'VDI_LABEL', 'Win2012R2 0', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin']
Aug 9 11:26:53 pns-xen07 SM: [30159]   pread SUCCESS
Aug 9 11:26:53 pns-xen07 SM: [30159] ['rbd', 'image-meta', 'set', 'VHD-46e962e3-e116-4734-b977-973e90f5ba17', 'VDI_DESCRIPTION', 'Created by template provisioner', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin']
Aug 9 11:26:53 pns-xen07 SM: [30159]   pread SUCCESS
Aug 9 11:26:53 pns-xen07 SM: [30159] ['rbd', 'image-meta', 'set', 'VHD-46e962e3-e116-4734-b977-973e90f5ba17', 'SNAP-d434857f-e438-4ee0-a287-a83de4b00e84', '19700101T00:00:00Z', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin']
Aug 9 11:26:53 pns-xen07 SM: [30159]   pread SUCCESS
Aug 9 11:26:53 pns-xen07 SM: [30159] RBDVDI.update start setting snapshots
Aug 9 11:26:53 pns-xen07 SM: [30289] ['ceph', 'df', '--format', 'json', '--name', 'client.admin']
Aug 9 11:26:54 pns-xen07 SM: [30289]   pread SUCCESS
Aug 9 11:26:54 pns-xen07 SM: [30289] vdi_update {'sr_uuid': 'ff12160f-ff09-40bb-a874-1366ad907f44', 'subtask_of': 'DummyRef:|8f9e1082-90d9-e264-2a3a-71b4c6118ca3|VDI.stat', 'vdi_ref': 'OpaqueRef:31d8a7e9-31be-7d48-8689-f034f1bb1b86', 'vdi_on_boot': 'persist', 'args': [], 'vdi_location': 'd434857f-e438-4ee0-a287-a83de4b00e84', 'host_ref': 'OpaqueRef:0e9c18cb-c243-2d9e-b4db-7bb854e066df', 'session_ref': 'OpaqueRef:4c22a80a-84e7-af0a-bb1f-3987d278c6ba', 'device_config': {'SRmaster': 'true'}, 'command': 'vdi_update', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:eab55273-a6e8-6963-3264-7f2df1efd9f5', 'vdi_uuid': 'd434857f-e438-4ee0-a287-a83de4b00e84'}
Aug 9 11:26:54 pns-xen07 SM: [30289] RBDSR.update for d434857f-e438-4ee0-a287-a83de4b00e84
Aug 9 11:26:54 pns-xen07 SM: [30289] ['rbd', 'image-meta', 'set', 'VHD-d434857f-e438-4ee0-a287-a83de4b00e84', 'VDI_LABEL', 'Win2012R2 0', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin']
Aug 9 11:26:54 pns-xen07 SM: [30289] FAILED in util.pread: (rc 2) stdout: '', stderr: 'rbd: error opening image VHD-d434857f-e438-4ee0-a287-a83de4b00e84: (2) No such file or directory
Aug 9 11:26:54 pns-xen07 SM: [30289] '
Aug 9 11:26:54 pns-xen07 SM: [30289] ***** vdi_update: EXCEPTION <class 'util.CommandException'>, No such file or directory
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/SRCommand.py", line 110, in run
Aug 9 11:26:54 pns-xen07 SM: [30289]     return self._run_locked(sr)
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/SRCommand.py", line 159, in _run_locked
Aug 9 11:26:54 pns-xen07 SM: [30289]     rv = self._run(sr, target)
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/SRCommand.py", line 230, in _run
Aug 9 11:26:54 pns-xen07 SM: [30289]     return target.update(self.params['sr_uuid'], self.vdi_uuid)
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/RBDSR", line 603, in update
Aug 9 11:26:54 pns-xen07 SM: [30289]     cephutils.VDI.update(self, sr_uuid, vdi_uuid)
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/cephutils.py", line 246, in update
Aug 9 11:26:54 pns-xen07 SM: [30289]     util.pread2(["rbd", "image-meta", "set", vdi_name, "VDI_LABEL", self.label, "--pool", self.sr.CEPH_POOL_NAME, "--name", self.sr.CEPH_USER])
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/util.py", line 189, in pread2
Aug 9 11:26:54 pns-xen07 SM: [30289]     return pread(cmdlist, quiet = quiet)
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/util.py", line 182, in pread
Aug 9 11:26:54 pns-xen07 SM: [30289]     raise CommandException(rc, str(cmdlist), stderr.strip())
Aug 9 11:26:54 pns-xen07 SM: [30289]
Aug 9 11:26:54 pns-xen07 SM: [30289] Raising exception [202, General backend error [opterr=Command ['rbd', 'image-meta', 'set', 'VHD-d434857f-e438-4ee0-a287-a83de4b00e84', 'VDI_LABEL', 'Win2012R2 0', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin'] failed (rbd: error opening image VHD-d434857f-e438-4ee0-a287-a83de4b00e84: (2) No such file or directory): No such file or directory]]
Aug 9 11:26:54 pns-xen07 SM: [30289] ***** RBD: EXCEPTION <class 'SR.SROSError'>, General backend error [opterr=Command ['rbd', 'image-meta', 'set', 'VHD-d434857f-e438-4ee0-a287-a83de4b00e84', 'VDI_LABEL', 'Win2012R2 0', '--pool', 'RBD_XenStorage-ff12160f-ff09-40bb-a874-1366ad907f44', '--name', 'client.admin'] failed (rbd: error opening image VHD-d434857f-e438-4ee0-a287-a83de4b00e84: (2) No such file or directory): No such file or directory]
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/SRCommand.py", line 352, in run
Aug 9 11:26:54 pns-xen07 SM: [30289]     ret = cmd.run(sr)
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/SRCommand.py", line 120, in run
Aug 9 11:26:54 pns-xen07 SM: [30289]     raise xs_errors.XenError(excType, opterr=msg)
Aug 9 11:26:54 pns-xen07 SM: [30289]   File "/opt/xensource/sm/xs_errors.py", line 52, in __init__
Aug 9 11:26:54 pns-xen07 SM: [30289]     raise SR.SROSError(errorcode, errormessage)
Aug 9 11:26:54 pns-xen07 SM: [30289]
Aug 9 11:26:54 pns-xen07 SM: [30159] ***** vdi_update: EXCEPTION <class 'XenAPI.Failure'>, ['INTERNAL_ERROR', 'Storage_interface.Vdi_does_not_exist("d434857f-e438-4ee0-a287-a83de4b00e84")']
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/opt/xensource/sm/SRCommand.py", line 110, in run
Aug 9 11:26:54 pns-xen07 SM: [30159]     return self._run_locked(sr)
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/opt/xensource/sm/SRCommand.py", line 159, in _run_locked
Aug 9 11:26:54 pns-xen07 SM: [30159]     rv = self._run(sr, target)
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/opt/xensource/sm/SRCommand.py", line 230, in _run
Aug 9 11:26:54 pns-xen07 SM: [30159]     return target.update(self.params['sr_uuid'], self.vdi_uuid)
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/opt/xensource/sm/RBDSR", line 615, in update
Aug 9 11:26:54 pns-xen07 SM: [30159]     self.session.xenapi.VDI.set_name_label(snapshot_vdi_ref, self.session.xenapi.VDI.get_name_label(self_vdi_ref))
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 248, in __call__
Aug 9 11:26:54 pns-xen07 SM: [30159]     return self.__send(self.__name, args)
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 150, in xenapi_request
Aug 9 11:26:54 pns-xen07 SM: [30159]     result = _parse_result(getattr(self, methodname)(*full_params))
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 222, in _parse_result
Aug 9 11:26:54 pns-xen07 SM: [30159]     raise Failure(result['ErrorDescription'])
Aug 9 11:26:54 pns-xen07 SM: [30159]
Aug 9 11:26:54 pns-xen07 SM: [30159] Raising exception [202, General backend error [opterr=['INTERNAL_ERROR', 'Storage_interface.Vdi_does_not_exist("d434857f-e438-4ee0-a287-a83de4b00e84")']]]
Aug 9 11:26:54 pns-xen07 SM: [30159] ***** RBD: EXCEPTION <class 'SR.SROSError'>, General backend error [opterr=['INTERNAL_ERROR', 'Storage_interface.Vdi_does_not_exist("d434857f-e438-4ee0-a287-a83de4b00e84")']]
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/opt/xensource/sm/SRCommand.py", line 352, in run
Aug 9 11:26:54 pns-xen07 SM: [30159]     ret = cmd.run(sr)
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/opt/xensource/sm/SRCommand.py", line 120, in run
Aug 9 11:26:54 pns-xen07 SM: [30159]     raise xs_errors.XenError(excType, opterr=msg)
Aug 9 11:26:54 pns-xen07 SM: [30159]   File "/opt/xensource/sm/xs_errors.py", line 52, in __init__
Aug 9 11:26:54 pns-xen07 SM: [30159]     raise SR.SROSError(errorcode, errormessage)
Aug 9 11:26:54 pns-xen07 SM: [30159]
Aug 9 11:26:54 pns-xen07 SM: [29998] ***** sr_scan: EXCEPTION <class 'XenAPI.Failure'>, ['INTERNAL_ERROR', 'Storage_interface.Vdi_does_not_exist("46e962e3-e116-4734-b977-973e90f5ba17")']
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/SRCommand.py", line 110, in run
Aug 9 11:26:54 pns-xen07 SM: [29998]     return self._run_locked(sr)
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/SRCommand.py", line 159, in _run_locked
Aug 9 11:26:54 pns-xen07 SM: [29998]     rv = self._run(sr, target)
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/SRCommand.py", line 338, in _run
Aug 9 11:26:54 pns-xen07 SM: [29998]     return sr.scan(self.params['sr_uuid'])
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/RBDSR", line 223, in scan
Aug 9 11:26:54 pns-xen07 SM: [29998]     self._loadvdis()
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/RBDSR", line 159, in _loadvdis
Aug 9 11:26:54 pns-xen07 SM: [29998]     self.session.xenapi.VDI.set_name_description(vdi_ref, description)
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 248, in __call__
Aug 9 11:26:54 pns-xen07 SM: [29998]     return self.__send(self.__name, args)
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 150, in xenapi_request
Aug 9 11:26:54 pns-xen07 SM: [29998]     result = _parse_result(getattr(self, methodname)(*full_params))
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 222, in _parse_result
Aug 9 11:26:54 pns-xen07 SM: [29998]     raise Failure(result['ErrorDescription'])
Aug 9 11:26:54 pns-xen07 SM: [29998]
Aug 9 11:26:54 pns-xen07 SM: [29998] Raising exception [40, The SR scan failed [opterr=['INTERNAL_ERROR', 'Storage_interface.Vdi_does_not_exist("46e962e3-e116-4734-b977-973e90f5ba17")']]]
Aug 9 11:26:54 pns-xen07 SM: [29998] ***** RBD: EXCEPTION <class 'SR.SROSError'>, The SR scan failed [opterr=['INTERNAL_ERROR', 'Storage_interface.Vdi_does_not_exist("46e962e3-e116-4734-b977-973e90f5ba17")']]
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/SRCommand.py", line 352, in run
Aug 9 11:26:54 pns-xen07 SM: [29998]     ret = cmd.run(sr)
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/SRCommand.py", line 120, in run
Aug 9 11:26:54 pns-xen07 SM: [29998]     raise xs_errors.XenError(excType, opterr=msg)
Aug 9 11:26:54 pns-xen07 SM: [29998]   File "/opt/xensource/sm/xs_errors.py", line 52, in __init__
Aug 9 11:26:54 pns-xen07 SM: [29998]     raise SR.SROSError(errorcode, errormessage)
Aug 9 11:26:54 pns-xen07 SM: [29998]
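
For reference, this is roughly how I compare the VDIs xapi still has registered for the SR with the images that actually exist in the pool (a rough sketch; it assumes it runs on the pool master using the local xapi socket, with my pool/user names):

    # Rough sketch: list the VDI UUIDs xapi knows for the SR and compare them
    # with the RBD images actually present in the pool, to spot stale records.
    import json
    import subprocess
    import XenAPI

    SR_UUID = "ff12160f-ff09-40bb-a874-1366ad907f44"
    POOL = "RBD_XenStorage-" + SR_UUID

    session = XenAPI.xapi_local()
    session.xenapi.login_with_password("root", "")
    try:
        sr_ref = session.xenapi.SR.get_by_uuid(SR_UUID)
        xapi_uuids = set(session.xenapi.VDI.get_uuid(ref)
                         for ref in session.xenapi.SR.get_VDIs(sr_ref))
        images = json.loads(subprocess.check_output(
            ["rbd", "ls", "--pool", POOL, "--format", "json",
             "--name", "client.admin"]))
        rbd_uuids = set(name[len("VHD-"):] for name in images
                        if name.startswith("VHD-"))
        print("In xapi but not in Ceph:", sorted(xapi_uuids - rbd_uuids))
        print("In Ceph but not in xapi:", sorted(rbd_uuids - xapi_uuids))
    finally:
        session.xenapi.session.logout()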

mhoffmann75 commented 8 years ago

Sorry, I forgot to mention: this only seems to happen when I clone a VM, then take a snapshot, and afterwards delete the VM including all snapshots.

A freshly created VM with a snapshot can now be deleted fine, so your fix is working. But in the special case of a fast-cloned VM it is still broken.
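
In case it helps, this is the failing sequence expressed via XenAPI (a rough sketch of my manual steps; the VM name and the small helper are placeholders, not the plugin's code):

    # Rough reproduction sketch: fast-clone a VM, snapshot the clone, then
    # destroy the clone together with all of its snapshots and disks, and
    # finally rescan the SR (which is the step that fails).
    import XenAPI

    def destroy_vm_and_disks(session, vm_ref):
        # Destroy a VM (or snapshot) together with its non-CD VDIs;
        # VM.destroy alone does not remove the disks.
        for vbd in session.xenapi.VM.get_VBDs(vm_ref):
            if session.xenapi.VBD.get_type(vbd) == "Disk":
                session.xenapi.VDI.destroy(session.xenapi.VBD.get_VDI(vbd))
        session.xenapi.VM.destroy(vm_ref)

    session = XenAPI.xapi_local()
    session.xenapi.login_with_password("root", "")
    try:
        src = session.xenapi.VM.get_by_name_label("Win2012R2 0")[0]  # test VM (placeholder name)
        clone = session.xenapi.VM.clone(src, "clone-test")           # fast clone
        snap = session.xenapi.VM.snapshot(clone, "clone-test-snap")  # snapshot it
        destroy_vm_and_disks(session, snap)    # delete the snapshot ...
        destroy_vm_and_disks(session, clone)   # ... then the clone itself
        sr = session.xenapi.SR.get_by_uuid("ff12160f-ff09-40bb-a874-1366ad907f44")
        session.xenapi.SR.scan(sr)             # this scan is what fails
    finally:
        session.xenapi.session.logout()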

rposudnevskiy commented 8 years ago

Fixed. Please check it again.

Emmenemoi commented 8 years ago

Can't delete the detected snapshot VDI (not attached):

Aug  9 19:39:48 xenserver-test SM: [12778] vdi_delete {'sr_uuid': '2C740292-44FC-40F6-9949-1F68E59B9024', 'subtask_of': 'DummyRef:|6d354656-8fa8-c3a0-efc5-0d2541be7679|VDI.destroy', 'vdi_ref': 'OpaqueRef:1240f633-4438-4faf-cc6c-5be4a02a84a7', 'vdi_on_boot': 'persist', 'args': [], 'vdi_location': '33dd263e-934f-4094-8794-0ec024870835', 'host_ref': 'OpaqueRef:a3894b6e-409a-3cc3-0132-53cbb4803b6a', 'session_ref': 'OpaqueRef:c3d3b813-7933-e5ba-09fc-54bf39d2531c', 'device_config': {'rbd-mode': 'nbd', 'cephx-id': 'xenserver', 'SRmaster': 'true'}, 'command': 'vdi_delete', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:717833f2-b3c8-a0da-b5a3-abbe26981e47', 'vdi_uuid': '33dd263e-934f-4094-8794-0ec024870835'}
Aug  9 19:39:48 xenserver-test SM: [12778] RBDVDI.delete for 33dd263e-934f-4094-8794-0ec024870835
Aug  9 19:39:48 xenserver-test SM: [12778] Pause request for afba13c0-9906-459f-9b5d-3d9d9f3acf2a
Aug  9 19:39:48 xenserver-test SM: [12778] ***** vdi_delete: EXCEPTION <class 'XenAPI.Failure'>, ['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:82153e6e-58e1-ee6f-4926-b69391a33650', 'paused']
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/SRCommand.py", line 110, in run
Aug  9 19:39:48 xenserver-test SM: [12778]     return self._run_locked(sr)
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/SRCommand.py", line 159, in _run_locked
Aug  9 19:39:48 xenserver-test SM: [12778]     rv = self._run(sr, target)
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/SRCommand.py", line 237, in _run
Aug  9 19:39:48 xenserver-test SM: [12778]     return target.delete(self.params['sr_uuid'], self.vdi_uuid)
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/RBDSR", line 332, in delete
Aug  9 19:39:48 xenserver-test SM: [12778]     self._delete_snapshot(self_sm_config["snapshot-of"], vdi_uuid)
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/cephutils.py", line 275, in _delete_snapshot
Aug  9 19:39:48 xenserver-test SM: [12778]     if not blktap2.VDI.tap_pause(self.session, self.sr.uuid, vdi_uuid):
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/blktap2.py", line 1350, in tap_pause
Aug  9 19:39:48 xenserver-test SM: [12778]     session.xenapi.VDI.add_to_sm_config(vdi_ref, 'paused', 'true')
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 248, in __call__
Aug  9 19:39:48 xenserver-test SM: [12778]     return self.__send(self.__name, args)
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 150, in xenapi_request
Aug  9 19:39:48 xenserver-test SM: [12778]     result = _parse_result(getattr(self, methodname)(*full_params))
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 222, in _parse_result
Aug  9 19:39:48 xenserver-test SM: [12778]     raise Failure(result['ErrorDescription'])
Aug  9 19:39:48 xenserver-test SM: [12778]
Aug  9 19:39:48 xenserver-test SM: [12778] Raising exception [80, Failed to mark VDI hidden [opterr=['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:82153e6e-58e1-ee6f-4926-b69391a33650', 'paused']]]
Aug  9 19:39:48 xenserver-test SM: [12778] ***** RBD: EXCEPTION <class 'SR.SROSError'>, Failed to mark VDI hidden [opterr=['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:82153e6e-58e1-ee6f-4926-b69391a33650', 'paused']]
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/SRCommand.py", line 352, in run
Aug  9 19:39:48 xenserver-test SM: [12778]     ret = cmd.run(sr)
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/SRCommand.py", line 120, in run
Aug  9 19:39:48 xenserver-test SM: [12778]     raise xs_errors.XenError(excType, opterr=msg)
Aug  9 19:39:48 xenserver-test SM: [12778]   File "/opt/xensource/sm/xs_errors.py", line 52, in __init__
Aug  9 19:39:48 xenserver-test SM: [12778]     raise SR.SROSError(errorcode, errormessage)

rposudnevskiy commented 8 years ago

I guess you receive this error because of a stale attribute left in sm_config by the previous error. Try this:

xe vdi-forget uuid=33dd263e-934f-4094-8794-0ec024870835
xe sr-scan uuid=2C740292-44FC-40F6-9949-1F68E59B9024

Now try to delete the VDI again.
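
If you prefer to do it through XenAPI instead of xe, the equivalent is roughly this (a sketch; the sm_config cleanup at the end is only an optional extra for a stuck 'paused' flag, not part of the two commands above):

    # Sketch of the recovery via XenAPI: forget the stale VDI record and
    # rescan the SR so it gets re-introduced from the RBD pool.
    import XenAPI

    VDI_UUID = "33dd263e-934f-4094-8794-0ec024870835"
    SR_UUID = "2C740292-44FC-40F6-9949-1F68E59B9024"

    session = XenAPI.xapi_local()
    session.xenapi.login_with_password("root", "")
    try:
        vdi = session.xenapi.VDI.get_by_uuid(VDI_UUID)
        session.xenapi.VDI.forget(vdi)                  # xe vdi-forget uuid=...
        sr = session.xenapi.SR.get_by_uuid(SR_UUID)
        session.xenapi.SR.scan(sr)                      # xe sr-scan uuid=...
        # Optional extra: drop a stale 'paused' flag left by the failed delete.
        for ref in session.xenapi.SR.get_VDIs(sr):
            if "paused" in session.xenapi.VDI.get_sm_config(ref):
                session.xenapi.VDI.remove_from_sm_config(ref, "paused")
    finally:
        session.xenapi.session.logout()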

Emmenemoi commented 8 years ago

Correct, sorry.