lowRISC / opentitan

OpenTitan: Open source silicon root of trust
https://www.opentitan.org
Apache License 2.0

[test-triage] flaky rv_dm_stress_all_with_rand_reset test #24042

Closed. nbdd0121 closed this issue 1 week ago.

nbdd0121 commented 2 months ago

Hierarchy of regression failure

Block level

Failure Description

UVM_ERROR (rv_dm_jtag_dmi_dm_inactive_vseq.sv:24) [rv_dm_jtag_dmi_dm_inactive_vseq] Check failed rdata == as_reg.get_reset() ( [] vs []) has 5 failures:

Test rv_dm_stress_all_with_rand_reset has 5 failures.

    0.rv_dm_stress_all_with_rand_reset.66975641602902106057314263067713562914841077465994968986660128468372726806578
    Line 268, in log /container/opentitan-public/scratch/os_regression/rv_dm-sim-vcs/0.rv_dm_stress_all_with_rand_reset/latest/run.log

      UVM_ERROR @ 1851933421 ps: (rv_dm_jtag_dmi_dm_inactive_vseq.sv:24) [uvm_test_top.env.virtual_sequencer.rv_dm_jtag_dmi_dm_inactive_vseq] Check failed rdata == as_reg.get_reset() (31 [0x1f] vs 0 [0x0])
      UVM_INFO @ 1851933421 ps: (uvm_report_catcher.svh:705) [UVM/REPORT/CATCHER]
      --- UVM Report catcher Summary ---

    2.rv_dm_stress_all_with_rand_reset.102753751660746842027780832671237996069892575362127291217035800174311628604811
    Line 257, in log /container/opentitan-public/scratch/os_regression/rv_dm-sim-vcs/2.rv_dm_stress_all_with_rand_reset/latest/run.log

      UVM_ERROR @ 1397913953 ps: (rv_dm_jtag_dmi_dm_inactive_vseq.sv:24) [uvm_test_top.env.virtual_sequencer.rv_dm_jtag_dmi_dm_inactive_vseq] Check failed rdata == as_reg.get_reset() (2280483741 [0x87ed6b9d] vs 0 [0x0])
      UVM_INFO @ 1397913953 ps: (uvm_report_catcher.svh:705) [UVM/REPORT/CATCHER]
      --- UVM Report catcher Summary ---

    ... and 3 more failures.
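For triage context, the failing line compares a register value read back over the JTAG DMI against that register's reset value. Below is a minimal sketch of what such a check could look like, modeled on common OpenTitan DV idioms (`csr_rd`, `DV_CHECK_EQ`, a `jtag_dmi_ral` register model); apart from the `rdata == as_reg.get_reset()` comparison taken from the log, every name here is an assumption, not the actual source of rv_dm_jtag_dmi_dm_inactive_vseq.sv:

```systemverilog
// Hypothetical sketch only: structure and names are assumed from OpenTitan DV
// conventions; only the rdata == as_reg.get_reset() check comes from the log.
virtual task body();
  uvm_reg        regs[$];
  uvm_reg_data_t rdata;
  // With the debug module inactive (dmcontrol.dmactive = 0), every register
  // reachable over the JTAG DMI should read back its documented reset value.
  cfg.jtag_dmi_ral.get_registers(regs);
  foreach (regs[i]) begin
    uvm_reg as_reg = regs[i];
    csr_rd(.ptr(as_reg), .value(rdata));
    // This is the comparison that produces the UVM_ERROR above when a read
    // returns stale or random data instead of the reset value.
    `DV_CHECK_EQ(rdata, as_reg.get_reset())
  end
endtask
```

Because the `stress_all_with_rand_reset` wrapper asserts resets at random points while sequences like this are running, a read that races the reset could plausibly return a non-reset value such as the 0x1f and 0x87ed6b9d seen above, which would make this check flaky rather than a functional bug.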

Steps to Reproduce

Tests with similar or related failures

No response

vogelpi commented 1 month ago

Thanks for filing the issue @nbdd0121. This is a V3 test, so I am giving it a lower priority than normal and moving it to M7. I am assigning @rswarbrick, our RV_DM master, for now :-)

rswarbrick commented 1 week ago

This is now passing for all seeds except one in the last run, for which I've just filed a PR with a fix. I don't think we need this issue to track it any more.