Open · DanielOsypenko opened this issue 8 months ago
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 30 days if no further activity occurs.
A frequent issue occurs when the ocs-storagecluster label gets mixed with rack labels (a CLI check is sketched below): https://reportportal-ocs4.apps.ocp-c1.prod.psi.redhat.com/ui/#ocs/launches/678/22636/1084612/1084618/log
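A quick way to confirm the label mix from the CLI is sketched below. This is a minimal, hypothetical helper (not part of ocs-ci), assuming the standard ODF storage-node label cluster.ocs.openshift.io/openshift-storage and the rack label topology.rook.io/rack; adjust the keys if the cluster uses different topology labels.

```python
# Hypothetical helper: list worker nodes that carry both the ODF storage label
# and a rack label. Label keys below are assumptions, not taken from ocs-ci.
import json
import subprocess

STORAGE_LABEL = "cluster.ocs.openshift.io/openshift-storage"  # assumed ODF storage-node label
RACK_LABEL = "topology.rook.io/rack"  # assumed rack topology label


def nodes_with_mixed_labels():
    """Return (node name, rack) pairs for nodes carrying both labels."""
    out = subprocess.run(
        ["oc", "get", "nodes", "-o", "json"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout
    mixed = []
    for node in json.loads(out)["items"]:
        labels = node["metadata"].get("labels", {})
        if STORAGE_LABEL in labels and RACK_LABEL in labels:
            mixed.append((node["metadata"]["name"], labels[RACK_LABEL]))
    return mixed


if __name__ == "__main__":
    for name, rack in nodes_with_mixed_labels():
        print(f"{name}: storage label present, rack={rack}")
```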
Although the test has stabilized at about a 70% pass rate, we still see failures:
E selenium.common.exceptions.ElementClickInterceptedException: Message: element click intercepted: Element ... is not clickable at point (2084, 1475). Other element would receive the click:
E (Session info: chrome=129.0.6668.100)
locator = ("(//*[@class='pf-topology__node__label']//*[contains(text(), 'j-087vi1cs33-t3-lldwn-worker-0-6wbkf')]/parent::*/parent::*/parent::*/parent::*//*[@class='pf-topology__node__decorator'])[2]", 'xpath')
...
tests/cross_functional/ui/test_odf_topology.py:119:
ocs_ci/ocs/ui/page_objects/odf_topology_tab.py:783: in validate_topology_configuration
deployment_topology = self.nodes_view.nav_into_node(
ocs_ci/utility/retry.py:31: in f_retry
return f(*args, **kwargs)
ocs_ci/ocs/ui/page_objects/odf_topology_tab.py:969: in nav_into_node
self.do_click(loc, 60, True)
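For the ElementClickInterceptedException itself, a common mitigation is to scroll the target decorator into the viewport and fall back to a JavaScript click. The sketch below is illustrative only: driver and locator are placeholders, and ocs-ci's do_click() may already cover part of this.

```python
# Illustrative workaround for ElementClickInterceptedException, not ocs-ci code.
from selenium.common.exceptions import ElementClickInterceptedException
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


def safe_click(driver, locator, timeout=60):
    """Click an element, scrolling it into view and falling back to a JS click."""
    element = WebDriverWait(driver, timeout).until(
        EC.element_to_be_clickable(locator)
    )
    # Center the element first; off-screen coordinates like (2084, 1475) are a
    # typical symptom of the node decorator sitting outside the viewport.
    driver.execute_script("arguments[0].scrollIntoView({block: 'center'});", element)
    try:
        element.click()
    except ElementClickInterceptedException:
        # Last resort: click via JavaScript so an overlaying element
        # (tooltip, toolbar) does not intercept the native click.
        driver.execute_script("arguments[0].click();", element)
```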
rook-ceph-crashcollector-j-086vi1cs33-t3-7rnt2-worker-0-5qjqf is present in the UI but not in the CLI output; it looks like something with labels. No bugs are open for this, and it is not a bug. RP - https://url.corp.redhat.com/48849ee
[2024-09-18T18:21:50.438Z] 14:21:50 - MainThread - ocs_ci.ocs.ui.base_ui - ERROR - deployments of the node 'j-086vi1cs33-t3-7rnt2-worker-0-5qjqf' from UI do not match deployments from CLI
[2024-09-18T18:21:50.438Z] deployments_list_cli = '['csi-rbdplugin-provisioner', 'noobaa-operator', 'rook-ceph-exporter-j-086vi1cs33-t3-7rnt2-worker-0-5qjqf', 'rook-ceph-mds-ocs-storagecluster-cephfilesystem-b', 'rook-ceph-mgr-b', 'rook-ceph-mon-b', 'rook-ceph-osd-0', 'rook-ceph-tools']'
[2024-09-18T18:21:50.438Z] deployments_list_ui = '['csi-rbdplugin-provisioner', 'noobaa-operator', 'rook-ceph-crashcollector-j-086vi1cs33-t3-7rnt2-worker-0-5qjqf', 'rook-ceph-exporter-j-086vi1cs33-t3-7rnt2-worker-0-5qjqf', 'rook-ceph-mds-ocs-storagecluster-cephfilesystem-b', 'rook-ceph-mgr-b', 'rook-ceph-mon-b', 'rook-ceph-osd-0', 'rook-ceph-tools']'
[2024-09-18T18:21:53.689Z] 14:21:53 - MainThread - ocs_ci.ocs.ui.base_ui - INFO - Copy DOM file: /home/jenkins/current-cluster-dir/logs/ui_logs_dir_1726682042/dom/test_validate_topology_configuration/2024-09-18T14-21-53.068364_DOM.txt
[2024-09-18T18:21:53.944Z] 14:21:53 - MainThread - ocs_ci.ocs.ui.base_ui - ERROR - deployments of the node 'j-086vi1cs33-t3-7rnt2-worker-0-695t5' from UI do not match deployments from CLI
[2024-09-18T18:21:53.944Z] deployments_list_cli = '['busybox-ui-test', 'csi-cephfsplugin-provisioner', 'ocs-metrics-exporter', 'ocs-operator', 'rook-ceph-exporter-j-086vi1cs33-t3-7rnt2-worker-0-695t5', 'rook-ceph-mgr-a', 'rook-ceph-mon-a', 'rook-ceph-operator', 'rook-ceph-osd-1', 'rook-ceph-rgw-ocs-storagecluster-cephobjectstore-a', 'ux-backend-server']'
[2024-09-18T18:21:53.944Z] deployments_list_ui = '['busybox-ui-test', 'csi-cephfsplugin-provisioner', 'ocs-metrics-exporter', 'ocs-operator', 'rook-ceph-crashcollector-j-086vi1cs33-t3-7rnt2-worker-0-695t5', 'rook-ceph-exporter-j-086vi1cs33-t3-7rnt2-worker-0-695t5', 'rook-ceph-mgr-a', 'rook-ceph-mon-a', 'rook-ceph-operator', 'rook-ceph-osd-1', 'rook-ceph-rgw-ocs-storagecluster-cephobjectstore-a', 'ux-backend-server']'
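For reference, the mismatch above can be reproduced with a small set-difference check. The snippet below is an illustration (not the actual ocs-ci logic) that treats the node's crashcollector deployment as the only tolerated delta, matching the logs above.

```python
# Illustrative UI-vs-CLI deployment comparison; the tolerance rule is an assumption.
def compare_deployments(cli_list, ui_list, node_name):
    """Return (match, only_in_ui, only_in_cli), ignoring the node's crashcollector."""
    tolerated = {f"rook-ceph-crashcollector-{node_name}"}
    only_ui = set(ui_list) - set(cli_list) - tolerated
    only_cli = set(cli_list) - set(ui_list) - tolerated
    return not only_ui and not only_cli, only_ui, only_cli


ok, only_ui, only_cli = compare_deployments(
    ["rook-ceph-osd-0", "rook-ceph-mon-b"],
    ["rook-ceph-osd-0", "rook-ceph-mon-b",
     "rook-ceph-crashcollector-j-086vi1cs33-t3-7rnt2-worker-0-5qjqf"],
    "j-086vi1cs33-t3-7rnt2-worker-0-5qjqf",
)
print(ok)  # True: the crashcollector is the only delta, so it is tolerated
```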
https://reportportal-ocs4.apps.ocp-c1.prod.psi.redhat.com/ui/#ocs/launches/632/18393/894276/894283/log
@tier3
@bugzilla("2209251")
@bugzilla("2233027")
@polarion_id("OCS-4901")
def test_validate_topology_configuration(
    self,
    setup_ui_class,
    teardown_depl_busybox,
):
    """
    Test to validate configuration of ODF Topology for internal and external
    deployments, cloud based deployments and on-prem deployments, also for
    post-upgrade scenarios.
    """
E       Failed: got deviation in topology configuration, at least one check failed
E       {'worker_group_labels_not_equal': True}
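To debug the failed worker_group_labels_not_equal check from the CLI side, one option is to collect the zone/rack label of each worker node and compare the resulting set with the group names shown in the Topology view. This is a hedged sketch: the label keys are assumptions and the actual ocs-ci check may differ.

```python
# Hypothetical debugging helper; label keys are assumptions, not ocs-ci constants.
import json
import subprocess

ZONE_LABEL = "topology.kubernetes.io/zone"  # standard Kubernetes zone label
RACK_LABEL = "topology.rook.io/rack"        # assumed rack label


def worker_group_labels():
    """Return the set of rack/zone group labels carried by worker nodes."""
    out = subprocess.run(
        ["oc", "get", "nodes", "-l", "node-role.kubernetes.io/worker", "-o", "json"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout
    groups = set()
    for node in json.loads(out)["items"]:
        labels = node["metadata"].get("labels", {})
        groups.add(labels.get(RACK_LABEL) or labels.get(ZONE_LABEL))
    return groups


print(worker_group_labels())  # compare with the group names in the ODF Topology UI
```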