osism / issues

This repository is used for bug reports that are cross-project or not bound to a specific repository (or to an unknown repository).
https://www.osism.tech

Volume attachment is not deleted when deleting instance #537

Open Nils98Ar opened 1 year ago

Nils98Ar commented 1 year ago

Reproduce:

Rebuilding seems to be affected as well.

Error: Unable to retrieve attachment information. [Details](https://xyz.abc/project/instances/#message_details)

Instance d94b2a7d-ca3e-4d98-812b-a9c8543e1163 could not be found. (HTTP 404) (Request-ID: req-8a31474c-5765-4f5f-b51f-da88b882d3e9)

Can you reproduce this?

Nils98Ar commented 1 year ago

Error in cinder-api.log:


2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault [req-7a9d2d2b-4303-4b00-9607-9e78b47a3600 req-dec1c0ff-1935-422a-8ff5-18dcbc41882b 3bbd834cd1cf4415a62e20636c0e532c 4b5570be3d9f4eabb31fc65d646453b2 - - default default] Caught error: <class 'cinder.exception.ConflictNovaUsingAttachment'> Detach volume from instance d94b2a7d-ca3e-4d98-812b-a9c8543e1163 using the Compute API: cinder.exception.ConflictNovaUsingAttachment: Detach volume from instance d94b2a7d-ca3e-4d98-812b-a9c8543e1163 using the Compute API
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault Traceback (most recent call last):
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/api/middleware/fault.py", line 84, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return req.get_response(self.application)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/request.py", line 1313, in send
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     status, headers, app_iter = self.call_application(
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/request.py", line 1278, in call_application
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     app_iter = application(self.environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 143, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return resp(environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 129, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     resp = self.call_func(req, *args, **kw)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 193, in call_func
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return self.func(req, *args, **kwargs)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/osprofiler/web.py", line 111, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return request.get_response(self.application)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/request.py", line 1313, in send
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     status, headers, app_iter = self.call_application(
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/request.py", line 1278, in call_application
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     app_iter = application(self.environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 129, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     resp = self.call_func(req, *args, **kw)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 193, in call_func
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return self.func(req, *args, **kwargs)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/keystonemiddleware/auth_token/__init__.py", line 341, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     response = req.get_response(self._app)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/request.py", line 1313, in send
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     status, headers, app_iter = self.call_application(
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/request.py", line 1278, in call_application
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     app_iter = application(self.environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 143, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return resp(environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 143, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return resp(environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/routes/middleware.py", line 153, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     response = self.app(environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 143, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return resp(environ, start_response)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 129, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     resp = self.call_func(req, *args, **kw)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/webob/dec.py", line 193, in call_func
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return self.func(req, *args, **kwargs)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/api/openstack/wsgi.py", line 839, in __call__
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return self._process_stack(request, action, action_args,
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/api/openstack/wsgi.py", line 900, in _process_stack
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     action_result = self.dispatch(meth, request, action_args)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/api/openstack/wsgi.py", line 995, in dispatch
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return method(req=request, **action_args)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/api/openstack/wsgi.py", line 1160, in version_select
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     return func.func(self, *args, **kwargs)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/api/v3/attachments.py", line 282, in delete
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     attachments = self.volume_api.attachment_delete(context, attachment)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/volume/api.py", line 2625, in attachment_delete
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     self.attachment_deletion_allowed(ctxt, attachment)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault   File "/var/lib/kolla/venv/lib/python3.10/site-packages/cinder/volume/api.py", line 2616, in attachment_deletion_allowed
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault     raise exception.ConflictNovaUsingAttachment(instance_id=server_id)
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault cinder.exception.ConflictNovaUsingAttachment: Detach volume from instance d94b2a7d-ca3e-4d98-812b-a9c8543e1163 using the Compute API
2023-05-31 14:09:18.056 23 ERROR cinder.api.middleware.fault
2023-05-31 14:09:18.057 23 INFO cinder.api.middleware.fault [req-7a9d2d2b-4303-4b00-9607-9e78b47a3600 req-dec1c0ff-1935-422a-8ff5-18dcbc41882b 3bbd834cd1cf4415a62e20636c0e532c 4b5570be3d9f4eabb31fc65d646453b2 - - default default] http://api-int.xyz:8776/v3/4b5570be3d9f4eabb31fc65d646453b2/attachments/8dd7761b-ba15-454c-b45a-12e4483f51f0 returned with HTTP 409

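
For context: the ConflictNovaUsingAttachment above comes from Cinder's attachment_deletion_allowed check, which (as part of the hardening for CVE-2023-2088, the Launchpad bug linked later in this thread) refuses to delete an attachment that still references an instance unless the request carries a service token, and tells the caller to detach via the Compute API instead. A deliberately simplified sketch of that decision, not the actual Cinder code:

```python
# Deliberately simplified sketch (assumption: this mirrors the spirit of
# Cinder's attachment_deletion_allowed guard, not the real implementation).

class ConflictNovaUsingAttachment(Exception):
    """Maps to the HTTP 409 seen in the log above."""

def attachment_deletion_allowed(attachment, has_service_token):
    """Allow deleting an attachment record directly only when safe."""
    # Requests from Nova itself carry a service token and are allowed.
    if has_service_token:
        return
    # A plain user request may not delete an attachment that still
    # references an instance -- the user must detach via the Compute API.
    instance_id = attachment.get("instance_uuid")
    if instance_id:
        raise ConflictNovaUsingAttachment(
            f"Detach volume from instance {instance_id} using the Compute API")

# The failing case from this issue: the instance is already gone in Nova,
# but the attachment record still references it, so the guard fires.
attachment = {"instance_uuid": "d94b2a7d-ca3e-4d98-812b-a9c8543e1163"}
try:
    attachment_deletion_allowed(attachment, has_service_token=False)
except ConflictNovaUsingAttachment as exc:
    print(exc)
```

The bug here is that the attachment still references the instance after Nova has already deleted it, so the guard keeps firing with HTTP 409.
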
Nils98Ar commented 1 year ago

Maybe related to:

artificial-intelligence commented 1 year ago

So, I tried to reproduce this via Horizon, but failed so far:

I created a test network (this should not matter, but you can't launch an instance via horizon without some kind of network).

Then I created a new instance via Horizon with the following parameters:

The volume spec, according to Horizon, is:

After the instance was successfully launched I deleted it via Horizon. I could not observe any error in Horizon.

I then looked at the volume's status, which is:

@Nils98Ar could you please specify exactly which steps you took to "Create instance with volume"? There are multiple ways to do this via the GUI, e.g. did you manually create a volume first (and with which options), did you set the "delete volume on instance deletion" flag, did you create the instance with boot from volume or from an image, did you use any ephemeral storage, etc.?

Thanks!

PS: I also tried to reproduce this via the OpenStack CLI, with the same result.

artificial-intelligence commented 1 year ago

Additional question for @Nils98Ar: in which environment did this error occur, i.e. which OSISM release version is deployed?

Nils98Ar commented 1 year ago

@artificial-intelligence

The OSISM version is 5.1.0. The OpenStack and Ceph versions are not defined explicitly, so Zed and Pacific (16.2.11) are used. Labels of the used quay.io/osism/cinder-api:zed image:

"Labels": {
    "build-date": "20230512",
    "de.osism.commit.docker_images_kolla": "d2a54c4",
    "de.osism.commit.kolla": "5ae0803da",
    "de.osism.commit.kolla_version": "15.1.1",
    "de.osism.commit.release": "b2f50f6",
    "de.osism.release.openstack": "zed",
    "de.osism.version": "latest",
    "kolla_version": "zed",
    "name": "cinder-api",
    "org.opencontainers.image.created": "2023-05-12T00:15:13.562362+00:00",
    "org.opencontainers.image.documentation": "https://docs.osism.tech",
    "org.opencontainers.image.licenses": "ASL 2.0",
    "org.opencontainers.image.ref.name": "ubuntu",
    "org.opencontainers.image.source": "https://github.com/osism/container-images-kolla",
    "org.opencontainers.image.title": "cinder-api",
    "org.opencontainers.image.url": "https://www.osism.tech",
    "org.opencontainers.image.vendor": "OSISM GmbH",
    "org.opencontainers.image.version": "latest"
}

Let's do it via the CLI.

Create instance:

dragon@manager:~$ openstack server create test2 --image "Debian 11" --flavor "SCS-1V-4-10" --network "CLOUD-INT" --boot-from-volume 10
+-------------------------------------+----------------------------------------------------+
| Field                               | Value                                              |
+-------------------------------------+----------------------------------------------------+
| OS-DCF:diskConfig                   | MANUAL                                             |
| OS-EXT-AZ:availability_zone         |                                                    |
| OS-EXT-SRV-ATTR:host                | None                                               |
| OS-EXT-SRV-ATTR:hypervisor_hostname | None                                               |
| OS-EXT-SRV-ATTR:instance_name       |                                                    |
| OS-EXT-STS:power_state              | NOSTATE                                            |
| OS-EXT-STS:task_state               | scheduling                                         |
| OS-EXT-STS:vm_state                 | building                                           |
| OS-SRV-USG:launched_at              | None                                               |
| OS-SRV-USG:terminated_at            | None                                               |
| accessIPv4                          |                                                    |
| accessIPv6                          |                                                    |
| addresses                           |                                                    |
| adminPass                           | JeesJVuBT5w3                                       |
| config_drive                        |                                                    |
| created                             | 2023-06-01T09:23:29Z                               |
| flavor                              | SCS-1V-4-10 (1b4243b5-e497-4c5d-b825-7d3ca6b00d8f) |
| hostId                              |                                                    |
| id                                  | ee7255f1-3489-4407-8d77-390633b84058               |
| image                               | N/A (booted from volume)                           |
| key_name                            | None                                               |
| name                                | test2                                              |
| progress                            | 0                                                  |
| project_id                          | e5bb574f8f8d4c72bc216801b6cd77de                   |
| properties                          |                                                    |
| security_groups                     | name='default'                                     |
| status                              | BUILD                                              |
| updated                             | 2023-06-01T09:23:29Z                               |
| user_id                             | 19cef1e468cf4089873b262f5e58c629                   |
| volumes_attached                    |                                                    |
+-------------------------------------+----------------------------------------------------+

Display volume with attachment:

dragon@manager:~$ openstack volume show f50507a7-35e4-40af-a70f-21810366edd8 -c status -c id -c attachments
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Field       | Value                                                                                                                                                                                                                                                                                                           |
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| attachments | [{'id': 'f50507a7-35e4-40af-a70f-21810366edd8', 'attachment_id': '76d03416-d433-4cb0-af9a-ce4f49ef5131', 'volume_id': 'f50507a7-35e4-40af-a70f-21810366edd8', 'server_id': 'ee7255f1-3489-4407-8d77-390633b84058', 'host_name': 'compute03', 'device': '/dev/sda', 'attached_at': '2023-06-01T09:23:42.000000'}] |
| id          | f50507a7-35e4-40af-a70f-21810366edd8                                                                                                                                                                                                                                                                            |
| status      | in-use                                                                                                                                                                                                                                                                                                          |
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

Delete instance (at the same time, the above error appears in cinder-api.log):

openstack server delete ee7255f1-3489-4407-8d77-390633b84058

Again display volume with attachment:

dragon@manager:~$ openstack volume show f50507a7-35e4-40af-a70f-21810366edd8 -c status -c id -c attachments
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Field       | Value                                                                                                                                                                                                                                                                                                           |
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| attachments | [{'id': 'f50507a7-35e4-40af-a70f-21810366edd8', 'attachment_id': '76d03416-d433-4cb0-af9a-ce4f49ef5131', 'volume_id': 'f50507a7-35e4-40af-a70f-21810366edd8', 'server_id': 'ee7255f1-3489-4407-8d77-390633b84058', 'host_name': 'compute03', 'device': '/dev/sda', 'attached_at': '2023-06-01T09:23:42.000000'}] |
| id          | f50507a7-35e4-40af-a70f-21810366edd8                                                                                                                                                                                                                                                                            |
| status      | in-use                                                                                                                                                                                                                                                                                                          |
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

Try to show server from attachment:

dragon@manager:~$ openstack server show ee7255f1-3489-4407-8d77-390633b84058
No Server found for ee7255f1-3489-4407-8d77-390633b84058
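
The leftover state this demonstrates can be cross-checked programmatically; a sketch with a hypothetical helper (not part of OSISM or OpenStack tooling), using the IDs from the reproduction above:

```python
# Hypothetical helper: compare a volume's attachment records against the
# set of servers that still exist, to spot attachments pointing at
# already-deleted instances.

def stale_attachments(attachments, existing_server_ids):
    """Return attachment records whose server no longer exists."""
    return [a for a in attachments
            if a["server_id"] not in existing_server_ids]

# Data from the reproduction: the volume still carries one attachment,
# while `openstack server show` reports the server as gone.
attachments = [{
    "attachment_id": "76d03416-d433-4cb0-af9a-ce4f49ef5131",
    "server_id": "ee7255f1-3489-4407-8d77-390633b84058",
}]
print(stale_attachments(attachments, existing_server_ids=set()))
```

In a live cloud the two inputs would come e.g. from openstack volume attachment list and openstack server list, and an operator could then remove the stale record once the underlying cause is fixed.
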
artificial-intelligence commented 1 year ago

Thanks for the steps to reproduce this. My test environment reacts like this:

openstack server create test2 --image "Debian 11" --flavor "SCS-1V-4-10" --boot-from-volume 10
+-----------------------------+----------------------------------------------------+
| Field                       | Value                                              |
+-----------------------------+----------------------------------------------------+
| OS-DCF:diskConfig           | MANUAL                                             |
| OS-EXT-AZ:availability_zone |                                                    |
| OS-EXT-STS:power_state      | NOSTATE                                            |
| OS-EXT-STS:task_state       | scheduling                                         |
| OS-EXT-STS:vm_state         | building                                           |
| OS-SRV-USG:launched_at      | None                                               |
| OS-SRV-USG:terminated_at    | None                                               |
| accessIPv4                  |                                                    |
| accessIPv6                  |                                                    |
| addresses                   |                                                    |
| adminPass                   | Ut4PE7qtV8oE                                       |
| config_drive                |                                                    |
| created                     | 2023-06-01T09:51:35Z                               |
| flavor                      | SCS-1V-4-10 (7c97be77-7ed3-488a-918f-749172cad50f) |
| hostId                      |                                                    |
| id                          | 21405473-7fca-4cc1-8b87-9c014af235e2               |
| image                       | N/A (booted from volume)                           |
| key_name                    | None                                               |
| name                        | test2                                              |
| progress                    | 0                                                  |
| project_id                  | 0e0cf814f17b494b8c3c5ccedfcf4bf8                   |
| properties                  |                                                    |
| security_groups             | name='default'                                     |
| status                      | BUILD                                              |
| updated                     | 2023-06-01T09:51:35Z                               |
| user_id                     | c28b1e9a7c2f42cbb39b92a469cafac3                   |
| volumes_attached            |                                                    |
+-----------------------------+----------------------------------------------------+

display the volume:

openstack volume show 96a56d57-95d0-43b1-9701-f80d16942bc1 -c status -c id -c attachments
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Field       | Value                                                                                                                                                                                                                                                                                                     |
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| attachments | [{'id': '96a56d57-95d0-43b1-9701-f80d16942bc1', 'attachment_id': 'e060f79e-6ece-4e9e-8916-730d101bceb8', 'volume_id': '96a56d57-95d0-43b1-9701-f80d16942bc1', 'server_id': '21405473-7fca-4cc1-8b87-9c014af235e2', 'host_name': None, 'device': '/dev/sda', 'attached_at': '2023-06-01T09:52:32.000000'}] |
| id          | 96a56d57-95d0-43b1-9701-f80d16942bc1                                                                                                                                                                                                                                                                      |
| status      | in-use                                                                                                                                                                                                                                                                                                    |
+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

delete instance:

openstack server delete test2

display instance:

openstack --os-cloud regiocloud1 server show test2
No Server found for test2

display volume:

openstack volume show 96a56d57-95d0-43b1-9701-f80d16942bc1 -c status -c id -c attachments
+-------------+--------------------------------------+
| Field       | Value                                |
+-------------+--------------------------------------+
| attachments | []                                   |
| id          | 96a56d57-95d0-43b1-9701-f80d16942bc1 |
| status      | available                            |
+-------------+--------------------------------------+

The fix for https://bugs.launchpad.net/nova/+bug/2004555 is not rolled out in this environment yet, AFAIK. I'll investigate there first.

Nils98Ar commented 1 year ago

@artificial-intelligence Could you share the labels of your used cinder-api image?

artificial-intelligence commented 1 year ago

This environment is on cinder-api:21.1.1.20230407, which contains the following labels:

"build-date": "20230407",
"de.osism.commit.docker_images_kolla": "09e02f8",
"de.osism.commit.kolla": "40deae888",
"de.osism.commit.kolla_version": "15.1.1",
"de.osism.commit.release": "443378a",
"de.osism.release.openstack": "zed",
"de.osism.service.version": "21.1.1",
"de.osism.version": "5.1.0",

Nils98Ar commented 1 year ago

Where do I see the de.osism.service.version label?

I am surprised that we have "de.osism.version": "latest" in the cinder-api image labels although we have manager_version: 5.1.0 set in environments/manager/configuration.yml.
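
(For reference, the labels can be read straight off the image, e.g. with docker inspect -f '{{ json .Config.Labels }}' quay.io/osism/cinder-api:zed. A small self-contained sketch that parses a saved sample of that output, so it runs without Docker; the values are examples from this thread:)

```python
# Sketch: parse the labels JSON as `docker inspect` would print it.
# The sample string stands in for real `docker inspect` output.
import json

sample = '{"de.osism.service.version": "21.1.1", "de.osism.version": "5.1.0"}'
labels = json.loads(sample)
print(labels["de.osism.service.version"])  # -> 21.1.1
```
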

Nils98Ar commented 1 year ago

@ppkuschy @berendt Apparently we are using the image https://quay.io/repository/osism/cinder-api with the tag zed. Is that the right one? cinder-api is just an example...

berendt commented 1 year ago

Those are the rolling tags. We rebuild them every night, based on the stable upstream branches. When using a stable release of OSISM you should not use the rolling tags.

Nils98Ar commented 1 year ago

@berendt Any idea why those are used although we have configured manager_version: 5.1.0?

berendt commented 1 year ago

https://release.osism.tech:

if openstack_version or ceph_version are set in environments/manager/configuration.yml (or anywhere else), they must be removed when using a stable release

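
In other words, for a stable release only the manager version should be pinned; a minimal sketch of environments/manager/configuration.yml, using the values from this thread:

```yaml
# environments/manager/configuration.yml -- stable-release sketch
manager_version: 5.1.0
# openstack_version and ceph_version must NOT be set here;
# pinning them re-enables the rolling tags (zed / pacific).
```
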
Nils98Ar commented 1 year ago

@berendt

Both openstack_version and ceph_version have been removed since the upgrade.

I ran MANAGER_VERSION=5.1.0 gilt overlay twice, then osism apply facts && osism apply configuration && osism reconciler sync, before running osism apply --action upgrade cinder again (I also tried reconfigure, running without an action, and deleting the cinder-api container beforehand).

But it still uses the zed tag...

berendt commented 1 year ago

Can you please share the output of docker compose -f /opt/manager/docker-compose.yml ps?

Nils98Ar commented 1 year ago

I did an osism-update-manager, then docker compose -f /opt/manager/docker-compose.yml up -d and then osism apply cinder again.

This seems to work and I will do the same for the other OpenStack services. One last question: with --action upgrade or without? After that I will check the instance delete / volume attachment behaviour again.

For completeness:

dragon@adm09901:/opt/configuration$ docker compose -f /opt/manager/docker-compose.yml ps
NAME                             IMAGE                                         COMMAND                  SERVICE                CREATED             STATUS                   PORTS
ceph-ansible                     quay.io/osism/ceph-ansible:5.1.0              "/entrypoint.sh osis…"   ceph-ansible           7 minutes ago       Up 6 minutes (healthy)
kolla-ansible                    quay.io/osism/kolla-ansible:5.1.0             "/entrypoint.sh osis…"   kolla-ansible          7 minutes ago       Up 6 minutes (healthy)
manager-api-1                    quay.io/osism/osism:0.20230407.0              "osism service api"      api                    2 weeks ago         Up 2 weeks (healthy)     10.99.2.51:8000->8000/tcp
manager-ara-server-1             quay.io/osism/ara-server:1.6.1                "sh -c '/wait && /ru…"   ara-server             7 minutes ago       Up 6 minutes (healthy)   10.99.2.51:8120->8000/tcp
manager-beat-1                   quay.io/osism/osism:0.20230407.0              "osism service beat"     beat                   2 weeks ago         Up 2 weeks (healthy)
manager-conductor-1              quay.io/osism/osism:0.20230407.0              "osism worker conduc…"   conductor              2 weeks ago         Up 2 weeks
manager-inventory_reconciler-1   quay.io/osism/inventory-reconciler:5.1.0      "/sbin/tini -- /entr…"   inventory_reconciler   2 weeks ago         Up 2 weeks (healthy)
manager-listener-1               quay.io/osism/osism:0.20230407.0              "osism service liste…"   listener               2 weeks ago         Up 2 weeks
manager-mariadb-1                index.docker.io/library/mariadb:10.11.3       "docker-entrypoint.s…"   mariadb                7 minutes ago       Up 7 minutes (healthy)   3306/tcp
manager-netbox-1                 quay.io/osism/osism-netbox:0.20230407.0       "osism worker netbox"    netbox                 2 weeks ago         Up 2 weeks
manager-openstack-1              quay.io/osism/osism:0.20230407.0              "osism worker openst…"   openstack              2 weeks ago         Up 2 weeks
manager-redis-1                  index.docker.io/library/redis:7.0.11-alpine   "docker-entrypoint.s…"   redis                  2 weeks ago         Up 2 weeks (healthy)     6379/tcp
manager-watchdog-1               quay.io/osism/osism:0.20230407.0              "osism service watch…"   watchdog               2 weeks ago         Up 2 weeks
osism-ansible                    quay.io/osism/osism-ansible:5.1.0             "/entrypoint.sh osis…"   osism-ansible          7 minutes ago       Up 6 minutes (healthy)
osismclient                      quay.io/osism/osism:0.20230407.0              "sleep infinity"         osismclient            7 minutes ago       Up 7 minutes
berendt commented 1 year ago

Use osism apply -a pull FOOBAR followed by osism apply -a upgrade FOOBAR. This way you enforce the pull action.

Nils98Ar commented 1 year ago

@berendt I've switched all OpenStack containers to version tags instead of rolling tags. @ppkuschy Now the behaviour is as you described and the attachment is deleted properly.

I think the problem is that the openstack_version was still configured in the past when we updated the manager.

At least now we know that volume detachment is broken in the rolling tag images :)

berendt commented 1 year ago

We'll add a test for attach/detach to the testbed.

fkr commented 1 year ago

thanks @Nils98Ar and @artificial-intelligence for the debugging!

berendt commented 8 months ago

Attach is done in the test play of the openstack environment:

    - name: Attach test volume
      openstack.cloud.server_volume:
        cloud: test
        state: present
        server: test
        volume: test
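
A matching detach step could be added with the same module (state: absent detaches the volume); this is a sketch of an assumed extension, not something already in the testbed:

```yaml
    - name: Detach test volume
      openstack.cloud.server_volume:
        cloud: test
        state: absent
        server: test
        volume: test
```
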