uyuni-project/uyuni

Source code for Uyuni
https://www.uyuni-project.org/
GNU General Public License v2.0

UI shows wrong System Status #7500

[Open] kurzandras opened this issue 1 year ago

kurzandras commented 1 year ago

Hello!

I have an issue with the UI: it seems that it sometimes displays wrong information about SLES systems.

The UI states that there are Software Updates Available (Critical: 1, Non-Critical: 5, Packages: 29).

However, if I check on the system with zypper, there are no patches available:

sles12> zypper lp
Loading repository data...
Reading installed packages...
No updates found.

Listing the upgradeable packages shows something different as well:

sles12> zypper lu
Loading repository data...
Reading installed packages...
S | Repository                                      | Name           | Current Version | Available Version | Arch
--+-------------------------------------------------+----------------+-----------------+-------------------+-------
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libSDL2-2_0-0  | 2.0.3-10.3      | 2.0.5-10.1        | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libavcodec57   | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libavdevice57  | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libavfilter6   | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libavformat57  | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libavresample3 | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libavutil55    | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libhdf5-10     | 1.8.15-6.20     | 1.8.17-5.1        | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libhdf5_hl10   | 1.8.15-6.20     | 1.8.17-5.1        | x86_64
v | SLE-SDK12-SP5-Pool for x86_64                   | libid3tag0     | 0.15.1b-182.58  | 0.15.1b-184.3.1   | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libpostproc54  | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libswresample2 | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SUSE-PackageHub-12-SP5-Standard-Pool for x86_64 | libswscale4    | 3.3.1-1.1       | 3.4.4-6.1         | x86_64
v | SLE-SDK12-SP5-Pool for x86_64                   | yasm           | 1.2.0-8.101     | 1.2.0-10.1        | x86_64

On the Uyuni server I can see that "salt 'sles12' pkg.list_updates" shows the same results as zypper itself:

----------
libSDL2-2_0-0:
    2.0.5-10.1
libavcodec57:
    3.4.4-6.1
libavdevice57:
    3.4.4-6.1
libavfilter6:
    3.4.4-6.1
libavformat57:
    3.4.4-6.1
libavresample3:
    3.4.4-6.1
libavutil55:
    3.4.4-6.1
libhdf5-10:
    1.8.17-5.1
libhdf5_hl10:
    1.8.17-5.1
libid3tag0:
    0.15.1b-184.3.1
libpostproc54:
    3.4.4-6.1
libswresample2:
    3.4.4-6.1
libswscale4:
    3.4.4-6.1
yasm:
    1.2.0-10.1
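
To cross-check, the server-side view can also be pulled with spacecmd (a minimal sketch, assuming spacecmd is configured against this Uyuni server and the system name matches the registered profile):

    # What the server database thinks the system can upgrade, and which errata apply
    spacecmd system_listupgrades sles12.mydomain.test
    spacecmd system_listerrata sles12.mydomain.test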

If I try to upgrade the system through the UI, the task is picked up and completed successfully with the following output:


      ID: sync_states
Function: saltutil.sync_states
    Name: sync_states
  Result: true
 Comment: No updates to sync
 Started: 11:28:36.011087
Duration: 176.161
     SLS: util.syncstates
 Changed: {}

      ID: mgr_absent_ca_package
Function: pkg.removed
    Name: rhn-org-trusted-ssl-cert
  Result: true
 Comment: All specified packages are already absent
 Started: 11:28:36.997043
Duration: 23.357
     SLS: certs
 Changed: {}

      ID: mgr_ca_cert
Function: file.managed
    Name: /etc/pki/trust/anchors/RHN-ORG-TRUSTED-SSL-CERT
  Result: true
 Comment: File /etc/pki/trust/anchors/RHN-ORG-TRUSTED-SSL-CERT is in the correct state
 Started: 11:28:37.021629
Duration: 30.74
     SLS: certs
 Changed: {}

      ID: null
Function: cmd.run
    Name: null
  Result: true
 Comment: State was not run because none of the onchanges reqs changed
 Started: 11:28:37.053194
Duration: 0.003
     SLS: certs
 Changed: {}

      ID: mgr_proxy_ca_cert_symlink
Function: file.symlink
    Name: /usr/share/rhn/RHN-ORG-TRUSTED-SSL-CERT
  Result: true
 Comment: onlyif condition is false
 Started: 11:28:37.053253
Duration: 407.56
     SLS: certs
 Changed: {}

      ID: mgr_deploy_tools_uyuni_key
Function: file.managed
    Name: /etc/pki/rpm-gpg/uyuni-tools-gpg-pubkey-0d20833e.key
  Result: true
 Comment: File /etc/pki/rpm-gpg/uyuni-tools-gpg-pubkey-0d20833e.key is in the correct state
 Started: 11:28:37.460943
Duration: 14.949
     SLS: channels.gpg-keys
 Changed: {}

      ID: mgr_deploy_suse_addon_key
Function: file.managed
    Name: /etc/pki/rpm-gpg/suse-addon-97a636db0bad8ecc.key
  Result: true
 Comment: File /etc/pki/rpm-gpg/suse-addon-97a636db0bad8ecc.key is in the correct state
 Started: 11:28:37.476010
Duration: 14.185
     SLS: channels.gpg-keys
 Changed: {}

      ID: mgrchannels_repo
Function: file.managed
    Name: /etc/zypp/repos.d/susemanager:channels.repo
  Result: true
 Comment: File /etc/zypp/repos.d/susemanager:channels.repo is in the correct state
 Started: 11:28:37.490374
Duration: 86.509
     SLS: channels
 Changed: {}

      ID: mgrchannels_install_products
Function: product.installed
    Name: mgrchannels_install_products
  Result: true
 Comment: All subscribed products are already installed
 Started: 11:28:37.577336
Duration: 1102.724
     SLS: channels
 Changed: {}

      ID: mgrchannels_inst_suse_build_key
Function: pkg.installed
    Name: suse-build-key
  Result: true
 Comment: All specified packages are already installed
 Started: 11:28:38.680497
Duration: 7129.017
     SLS: channels
 Changed: {}

      ID: mgr_regular_patches
Function: pkg.installed
    Name: mgr_regular_patches
  Result: true
 Comment: **Advisory patch is not needed or related packages are already installed**
 Started: 11:28:45.809854
Duration: 12579.294
     SLS: packages.patchinstall
 Changed: {}

I have been desperately trying to solve this issue for a long time, but I could not find anything useful anywhere. The log files seem to be okay, and everything else seems to be working fine. I have other systems that are not affected by this, but I do not know why; I could not find any underlying pattern. I have tried updating the software package list and even re-registering the host, but neither helped.
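
For reference, the package list update was triggered from the UI; a rough server-side equivalent (a sketch, assuming spacecmd is set up) would be:

    # Schedule a package profile refresh for this client
    spacecmd system_schedulepackagerefresh sles12.mydomain.test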

Could you please point me in the right direction regarding troubleshooting?

If you need any additional information, please do not hesitate to ask!

Thank you very much in advance!

Information for package Uyuni-Server-release:

Repository     : uyuni-server-stable
Name           : Uyuni-Server-release
Version        : 2023.04-220400.204.2.uyuni2
Arch           : x86_64
Vendor         : obs://build.opensuse.org/systemsmanagement:Uyuni
Support Level  : Level 3
Installed Size : 1.4 KiB
Installed      : Yes (automatically)
Status         : up-to-date
Source package : Uyuni-Server-release-2023.04-220400.204.2.uyuni2.src
Summary        : Uyuni Server
Description    : Uyuni lets you efficiently manage physical, virtual, and cloud-based Linux systems. It provides automated and cost-effective configuration and software management, asset management, and system provisioning.

dvosburg commented 1 year ago

What does Events -> History show for this system?

kurzandras commented 1 year ago

Here it is:

[screenshot: Events -> History entries for the system]

dvosburg commented 1 year ago

You should realize that the updates you are looking at come from the Package Hub repository. For SLES, this is an unsupported collection of openSUSE packages presented in a channel consumable by SLES. They are likely owned by a different "vendor" (openSUSE) than the normal SLES updates, and thus are not truly considered updates in SUMA.
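
A quick way to see the vendor difference (a minimal sketch; the package name is one example from the listing above):

    # Vendor of the installed package vs. the update candidate offered by the repo
    rpm -q --queryformat '%{VENDOR}\n' libSDL2-2_0-0
    zypper info libSDL2-2_0-0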

If you disable the PackageHub repo with zypper from the CLI, or un-assign it in SUSE Manager, does the problem go away? Do you really need PackageHub channels?
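
For the CLI route, something like this (a sketch; the repo name shown is taken from the zypper lu output above, check zypper repos for the exact alias on the client):

    # Find the PackageHub repo, then disable it
    zypper repos
    zypper modifyrepo --disable "SUSE-PackageHub-12-SP5-Standard-Pool for x86_64"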

kurzandras commented 1 year ago

Thank you for your answer!

Yes, it would be nice to have the PackageHub repo, but I have removed the Package Hub channel from this system just to try this out:

sles12 > zypper lp
Loading repository data...
Reading installed packages...
No updates found.

sles12 > zypper lu
Loading repository data...
Reading installed packages...
S | Repository                    | Name       | Current Version | Available Version | Arch
--+-------------------------------+------------+-----------------+-------------------+-------
v | SLE-SDK12-SP5-Pool for x86_64 | libid3tag0 | 0.15.1b-182.58  | 0.15.1b-184.3.1   | x86_64
v | SLE-SDK12-SP5-Pool for x86_64 | yasm       | 1.2.0-8.101     | 1.2.0-10.1        | x86_64

Now the System UI shows the following:

Software Updates Available (Critical: 1, Non-Critical: 5, Packages: 9)

The shown patches come from the same channels as before:

SLES12-SP5-Updates for x86_64
SLE-SDK12-SP5-Updates for x86_64
SLE-Module-Legacy12-Updates for x86_64 SP5

I have tried to apply these patches, but still nothing happened on the system; the salt output is the same:

[screenshot: salt output, same as above]

admd commented 1 year ago

Do you see errors like com.redhat.rhn.frontend.xmlrpc.InvalidErrataException: Invalid errata. in the logs when scheduling the updates from the UI?
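
A quick way to check (the default server-side log directory on Uyuni is /var/log/rhn/):

    # Search all server logs for the exception
    grep -ri "InvalidErrataException" /var/log/rhn/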

kurzandras commented 1 year ago

Hello!

I do not see anything like this at all. Here is the output of all the log files during the application of the "unnecessary" patches:

==> rhn_web_frontend.log <==
2023-09-15 15:37:44,081 [ajp-nio-0:0:0:0:0:0:0:1-8009-exec-6] INFO com.suse.manager.webui.controllers.FrontendLogController - [1 - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36] - Loading https://myuyuniinstall.mydomain.test/rhn/systems/details/ErrataConfirm.do?allowVendorChange=false&sid=1000010059

==> rhn_taskomatic_daemon.log <==
2023-09-15 15:37:45,580 [DefaultQuartzScheduler_Worker-14] INFO com.redhat.rhn.taskomatic.task.MinionActionExecutor - Executing action: 923

==> rhn_web_frontend.log <==
2023-09-15 15:37:45,738 [ajp-nio-0:0:0:0:0:0:0:1-8009-exec-4] INFO com.suse.manager.webui.controllers.FrontendLogController - [1 - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36] - Loading https://myuyuniinstall.mydomain.test/rhn/systems/details/ErrataList.do?sid=1000010059

==> rhn_taskomatic_daemon.log <==
2023-09-15 15:38:00,322 [DefaultQuartzScheduler_Worker-4] INFO com.redhat.rhn.taskomatic.task.SystemOverviewUpdateQueue - In the queue: 1
2023-09-15 15:38:00,598 [DefaultQuartzScheduler_Worker-13] INFO com.redhat.rhn.taskomatic.task.ErrataCacheTask - In the queue: 1

==> rhn_web_frontend.log <==
2023-09-15 15:38:37,594 [ajp-nio-0:0:0:0:0:0:0:1-8009-exec-7] INFO com.suse.manager.webui.controllers.FrontendLogController - [1 - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36] - Leaving https://myuyuniinstall.mydomain.test/rhn/systems/details/ErrataList.do?sid=1000010059
2023-09-15 15:38:37,930 [ajp-nio-0:0:0:0:0:0:0:1-8009-exec-2] INFO com.suse.manager.webui.controllers.FrontendLogController - [1 - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36] - Loading https://myuyuniinstall.mydomain.test/rhn/systems/details/ErrataList.do?sid=1000010059
2023-09-15 15:38:41,402 [ajp-nio-0:0:0:0:0:0:0:1-8009-exec-3] INFO com.suse.manager.webui.controllers.FrontendLogController - [1 - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36] - Loading https://myuyuniinstall.mydomain.test/rhn/systems/details/history/Pending.do?sid=1000010059&
2023-09-15 15:38:43,617 [ajp-nio-0:0:0:0:0:0:0:1-8009-exec-10] INFO com.suse.manager.webui.controllers.FrontendLogController - [1 - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36] - Loading https://myuyuniinstall.mydomain.test/rhn/systems/details/history/History.do?sid=1000010059&
2023-09-15 15:38:46,569 [ajp-nio-0:0:0:0:0:0:0:1-8009-exec-4] INFO com.suse.manager.webui.controllers.FrontendLogController - [1 - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36] - Loading https://myuyuniinstall.mydomain.test/rhn/systems/details/packages/Packages.do?sid=1000010059&

==> rhn_taskomatic_daemon.log <==
2023-09-15 15:39:00,460 [DefaultQuartzScheduler_Worker-15] INFO com.redhat.rhn.taskomatic.task.SystemOverviewUpdateQueue - In the queue: 1
2023-09-15 15:39:00,710 [DefaultQuartzScheduler_Worker-19] INFO com.redhat.rhn.taskomatic.task.ErrataCacheTask - In the queue: 1

Thank you for your help!

admd commented 1 year ago

That doesn't tell us much. Please monitor the salt event bus with

salt-run state.event pretty=True

while applying the patches from the web UI and see what you get there. That should help find the issue.
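
The runner also accepts a tag filter as its first argument, which cuts down the noise; a sketch:

    # Only show job events while the action runs
    salt-run state.event 'salt/job/*' pretty=True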

kurzandras commented 1 year ago

Thank you for your answer! Please have a look at the salt bus logs:

20230925064506777351 {
    "_stamp": "2023-09-25T06:45:06.787492",
    "minions": ["sles12.mydomain.test"]
}

salt/job/20230925064506777351/new {
    "_stamp": "2023-09-25T06:45:06.789749",
    "arg": [],
    "fun": "test.ping",
    "jid": "20230925064506777351",
    "minions": ["sles12.mydomain.test"],
    "missing": [],
    "tgt": ["sles12.mydomain.test"],
    "tgt_type": "list",
    "user": "admin"
}

salt/job/20230925064506777351/ret/sles12.mydomain.test {
    "_stamp": "2023-09-25T06:45:07.012746",
    "cmd": "_return",
    "fun": "test.ping",
    "fun_args": [],
    "id": "sles12.mydomain.test",
    "jid": "20230925064506777351",
    "metadata": {"batch-mode": true, "suma-action-chain": false, "suma-action-id": 928, "suma-force-pkg-list-refresh": false, "suma-minion-startup": false},
    "retcode": 0,
    "return": true,
    "success": true
}

salt/batch/20230925064506777945/start {
    "_stamp": "2023-09-25T06:45:07.015155",
    "available_minions": ["sles12.mydomain.test"],
    "down_minions": [],
    "metadata": {"batch-mode": true, "suma-action-chain": false, "suma-action-id": 928, "suma-force-pkg-list-refresh": false, "suma-minion-startup": false}
}

20230925064506777945 {
    "_stamp": "2023-09-25T06:45:07.018087",
    "minions": ["sles12.mydomain.test"]
}

salt/job/20230925064506777945/new {
    "_stamp": "2023-09-25T06:45:07.021723",
    "arg": [{
        "kwarg": true,
        "mods": ["packages.patchinstall"],
        "pillar": {
            "allow_vendor_change": true,
            "param_regular_patches": [
                "SUSE-SLE-Module-Legacy-12-2015-1018",
                "SUSE-SLE-Module-Legacy-12-2016-1120",
                "SUSE-SLE-Module-Legacy-12-2016-1503",
                "SUSE-SLE-SDK-12-SP5-2020-372",
                "SUSE-SLE-SDK-12-SP5-2021-3885",
                "SUSE-SLE-SDK-12-SP5-2023-3696",
                "SUSE-SLE-SERVER-12-SP5-2020-372",
                "SUSE-SLE-SERVER-12-SP5-2022-688",
                "SUSE-SLE-SERVER-12-SP5-2023-3552",
                "SUSE-SLE-SERVER-12-SP5-2023-3696"
            ],
            "param_update_stack_patches": []
        },
        "queue": true
    }],
    "fun": "state.apply",
    "jid": "20230925064506777945",
    "minions": ["sles12.mydomain.test"],
    "missing": [],
    "tgt": ["sles12.mydomain.test"],
    "tgt_type": "list",
    "user": "salt"
}

minion/refresh/sles12.mydomain.test {
    "Minion data cache refresh": "sles12.mydomain.test",
    "_stamp": "2023-09-25T06:45:07.472239"
}

salt/job/20230925064506777945/ret/sles12.mydomain.test {
    "_stamp": "2023-09-25T06:45:36.497933",
    "cmd": "_return",
    "fun": "state.apply",
    "fun_args": [{
        "mods": ["packages.patchinstall"],
        "pillar": {
            "allow_vendor_change": true,
            "param_regular_patches": [
                "SUSE-SLE-Module-Legacy-12-2015-1018",
                "SUSE-SLE-Module-Legacy-12-2016-1120",
                "SUSE-SLE-Module-Legacy-12-2016-1503",
                "SUSE-SLE-SDK-12-SP5-2020-372",
                "SUSE-SLE-SDK-12-SP5-2021-3885",
                "SUSE-SLE-SDK-12-SP5-2023-3696",
                "SUSE-SLE-SERVER-12-SP5-2020-372",
                "SUSE-SLE-SERVER-12-SP5-2022-688",
                "SUSE-SLE-SERVER-12-SP5-2023-3552",
                "SUSE-SLE-SERVER-12-SP5-2023-3696"
            ],
            "param_update_stack_patches": []
        },
        "queue": true
    }],
    "id": "sles12.mydomain.test",
    "jid": "20230925064506777945",
    "metadata": {"batch-mode": true, "suma-action-chain": false, "suma-action-id": 928, "suma-force-pkg-list-refresh": false, "suma-minion-startup": false},
    "out": "highstate",
    "retcode": 0,
    "return": {
        "cmd_|-update-ca-certificates_|-/usr/sbin/update-ca-certificates_|-run": {"__run_num__": 3, "__sls__": "certs", "__state_ran__": false, "changes": {}, "comment": "State was not run because none of the onchanges reqs changed", "duration": 0.003, "result": true, "start_time": "08:45:37.975571"},
        "file_|-mgr_ca_cert_|-/etc/pki/trust/anchors/RHN-ORG-TRUSTED-SSL-CERT_|-managed": {"__id__": "mgr_ca_cert", "__run_num__": 2, "__sls__": "certs", "changes": {}, "comment": "File /etc/pki/trust/anchors/RHN-ORG-TRUSTED-SSL-CERT is in the correct state", "duration": 60.641, "name": "/etc/pki/trust/anchors/RHN-ORG-TRUSTED-SSL-CERT", "result": true, "start_time": "08:45:37.914006"},
        "file_|-mgr_deploy_suse_addon_key_|-/etc/pki/rpm-gpg/suse-addon-97a636db0bad8ecc.key_|-managed": {"__id__": "mgr_deploy_suse_addon_key", "__run_num__": 6, "__sls__": "channels.gpg-keys", "changes": {}, "comment": "File /etc/pki/rpm-gpg/suse-addon-97a636db0bad8ecc.key is in the correct state", "duration": 15.749, "name": "/etc/pki/rpm-gpg/suse-addon-97a636db0bad8ecc.key", "result": true, "start_time": "08:45:38.713014"},
        "file_|-mgr_deploy_tools_uyuni_key_|-/etc/pki/rpm-gpg/uyuni-tools-gpg-pubkey-0d20833e.key_|-managed": {"__id__": "mgr_deploy_tools_uyuni_key", "__run_num__": 5, "__sls__": "channels.gpg-keys", "changes": {}, "comment": "File /etc/pki/rpm-gpg/uyuni-tools-gpg-pubkey-0d20833e.key is in the correct state", "duration": 33.533, "name": "/etc/pki/rpm-gpg/uyuni-tools-gpg-pubkey-0d20833e.key", "result": true, "start_time": "08:45:38.679361"},
        "file_|-mgr_proxy_ca_cert_symlink_|-/usr/share/rhn/RHN-ORG-TRUSTED-SSL-CERT_|-symlink": {"__id__": "mgr_proxy_ca_cert_symlink", "__run_num__": 4, "__sls__": "certs", "changes": {}, "comment": "onlyif condition is false", "duration": 703.599, "name": "/usr/share/rhn/RHN-ORG-TRUSTED-SSL-CERT", "result": true, "skip_watch": true, "start_time": "08:45:37.975632"},
        "file_|-mgrchannels_repo_|-/etc/zypp/repos.d/susemanager:channels.repo_|-managed": {"__id__": "mgrchannels_repo", "__run_num__": 7, "__sls__": "channels", "changes": {}, "comment": "File /etc/zypp/repos.d/susemanager:channels.repo is in the correct state", "duration": 91.881, "name": "/etc/zypp/repos.d/susemanager:channels.repo", "result": true, "start_time": "08:45:38.728947"},
        "pkg_|-mgr_absent_ca_package_|-rhn-org-trusted-ssl-cert_|-removed": {"__id__": "mgr_absent_ca_package", "__run_num__": 1, "__sls__": "certs", "changes": {}, "comment": "All specified packages are already absent", "duration": 23.401, "name": "rhn-org-trusted-ssl-cert", "result": true, "start_time": "08:45:37.872480"},
        "pkg_|-mgr_regular_patches_|-mgr_regular_patches_|-patch_installed": {"__id__": "mgr_regular_patches", "__run_num__": 10, "__sls__": "packages.patchinstall", "changes": {}, "comment": "Advisory patch is not needed or related packages are already installed", "duration": 13072.33, "name": "mgr_regular_patches", "result": true, "start_time": "08:45:49.287520"},
        "pkg_|-mgrchannels_inst_suse_build_key_|-suse-build-key_|-installed": {"__id__": "mgrchannels_inst_suse_build_key", "__run_num__": 9, "__sls__": "channels", "changes": {}, "comment": "All specified packages are already installed", "duration": 7683.128, "name": "suse-build-key", "result": true, "start_time": "08:45:41.604043"},
        "product_|-mgrchannels_install_products_|-mgrchannels_install_products_|-all_installed": {"__id__": "mgrchannels_install_products", "__run_num__": 8, "__sls__": "channels", "changes": {}, "comment": "All subscribed products are already installed", "duration": 2782.29, "name": "mgrchannels_install_products", "result": true, "start_time": "08:45:38.821289"},
        "saltutil_|-sync_states_|-sync_states_|-sync_states": {"__id__": "sync_states", "__run_num__": 0, "__sls__": "util.syncstates", "changes": {}, "comment": "No updates to sync", "duration": 186.767, "name": "sync_states", "result": true, "start_time": "08:45:35.563872"}
    },
    "success": true
}

salt/batch/20230925064506777945/done {
    "_stamp": "2023-09-25T06:45:37.501424",
    "available_minions": ["sles12.mydomain.test"],
    "done_minions": ["sles12.mydomain.test"],
    "down_minions": [],
    "metadata": {"batch-mode": true, "suma-action-chain": false, "suma-action-id": 928, "suma-force-pkg-list-refresh": false, "suma-minion-startup": false},
    "timedout_minions": []
}

If I try to reapply the patches, the output is almost the same (except timestamps and job IDs, of course).

kurzandras commented 11 months ago

Hello! This is still not resolved; any help would be greatly appreciated! Thank you in advance!