OSVVM / OSVVM-Scripts

OSVVM project simulation scripts. Scripts are tedious. These scripts simplify the steps to compile your project for simulation.

Passing attributes to simulation run scripts #32

Closed aakulinsk closed 2 years ago

aakulinsk commented 2 years ago

Hello!

Is there a way to run simulation build scripts (.pro) with attributes, to be able to flexibly pass, for example, the simulation time or some other testbench generic? It would also be nice to be able to change the name of the waveform source file.

Thanks!

JimLewis commented 2 years ago

By attributes, do you mean simulation time, testbench generics, and waveform files?

These are great questions for the OSVVM forum. See: https://osvvm.org/forums/forum/os-vvm

I will answer this in my blog post on OSVVM.org shortly. I will cover the topics I listed above. If you would like other topics covered, please respond here.

Did you look at OsvvmLibraries/Documentation/Script_user_guide.pdf?

aakulinsk commented 2 years ago

To be more precise: by passing attributes I mean running the script with additional arguments. For ModelSim scripting I could do it like this:

do run_simulation.do wave_top_wrapper.do 1us

where some of these arguments were passed to the testbench through generics and some were used only inside the run_simulation.do script:

if { $argc == 0 } {
  echo "| Haven't set any simulation parameters                                               |"
  set generics " "
} elseif { $argc == 1 } {
  echo "| Waveforms files were set:                                                            |"
  set wave_file_list $1
  set generics " "
} elseif { $argc == 2 } {
  echo "| Simulation time and waveforms files were set:                                    |"
  set wave_file_list $1
  set sim_time $2
  set generics -gSIM_TIME=$sim_time
}
vsim {*}$generics -c work.tb_top_wrapper -t ns

# Set waveforms
echo "Setting waveforms: "
foreach wave $wave_file_list {
  echo $wave
  do $wave
}

I am trying to reproduce similar behavior in .pro scripts. I've gone through the documentation and, yes, I've seen that there is a way to pass generics to a testbench. Unfortunately, I haven't seen a way to pass arguments when running the script.

In general, I find the possibility of changing testbench generics and waveform source files when running the compilation script very convenient and useful. It would be very nice to keep that and still be able to use OSVVM.

JimLewis commented 2 years ago

Is your goal to be able to run your older scripts with minimal changes or is it to get the same capability in OSVVM?

Currently OSVVM does not give you the option to set the waveform filename. Instead it looks for one out of a prescribed list. If your requirement is to be able to specify your own script file, that can be done; I am just not convinced of its value. Your top-level design is named tb_top_wrapper. If you include a file named tb_top_wrapper.tcl in your simulation directory, in the OSVVM script directory, or in the directory that contains your *.pro file, then it will be run by the OSVVM scripts as they are today. Likewise for wave.do.
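
For example, a minimal tb_top_wrapper.tcl placed in one of those directories could contain nothing more than wave commands. This is only a hypothetical illustration; it assumes a ModelSim-family simulator and an arbitrary signal hierarchy:

# tb_top_wrapper.tcl - hypothetical example, picked up automatically by the OSVVM scripts
add wave -recursive /tb_top_wrapper/*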

Do you ever add more than one wave file? Your "foreach" suggests that you do. Do you expect this setting to persist for only one simulation or multiple simulations?

The problem with the do command above is that it is not how Tcl programs really work; I think that is more the Siemens "do"-style macro scripting. In Tcl, the above would be:


proc run_simulation {LibraryUnit {WaveFiles ""} {SimTimeGeneric ""}} {
  # the next line would require that you overload OSVVM's SimulateRunScripts, see blog post referenced below.   
  set ::osvvm::WaveFileList $WaveFiles  
  if {$SimTimeGeneric ne ""} {
    set SimOptions [generic SIM_TIME $SimTimeGeneric]
  } else {
    set SimOptions ""
  }
  simulate $LibraryUnit {*}$SimOptions 
}

It is also possible that the current approach would use the WaveFileList and iterate over it. The default setting would be "wave.do".

While I was thinking about your original post, I wrote a blog post on OSVVM.org trying to anticipate where you were stuck. It is here: https://osvvm.org/archives/2075

JimLewis commented 2 years ago

@aakulinsk
I just did a little more playing around. The OSVVM simulate command format is:

simulate LibraryUnit args

In TCL, args is a way to handle an unspecified number of parameters. As a result, you can call simulate as follows:

simulate tb_top_wrapper [generic SIM_TIME 1ns] -do wave1.do -do wave2.do

If you want something that is closer to your run_simulation, but still takes advantage of OSVVM allowing additional commands at the end of simulate, you can add a script such as the following to the file LocalScriptDefaults.tcl. I made one change from the blog post: I put this before the OSVVM namespace (context):

  proc run_simulation {LibraryUnit {WaveFiles ""} {SimTimeGeneric ""} args} {
    set SimOptions ""
    if {$SimTimeGeneric ne ""} {
      append SimOptions [generic SIM_TIME $SimTimeGeneric] " "
    }
    if {$WaveFiles ne ""} {
      foreach wave $WaveFiles {
        append SimOptions "-do $wave " 
      }
    }
    if {$args ne ""} {
      append SimOptions [join $args " "]
    }

    simulate $LibraryUnit {*}$SimOptions
  }

namespace eval ::osvvm {
. . . 

I test drove this by changing the call to simulate to:

    puts "simulate $LibraryUnit {*}$SimOptions"

One of the goals of the OSVVM scripting is to avoid simulator-dependent actions wherever possible. The following aspect of the above code is ModelSim dependent:

      foreach wave $WaveFiles {
        append SimOptions "-do $wave " 
      }

This could be handled in the same fashion as we handle generics - by calling a function. So we could add a function to OSVVM, perhaps named DoWaves, that takes a list of waveform files as arguments. This would make the call to simulate as follows:

simulate tb_top_wrapper [generic SIM_TIME 1ns]  [DoWaves wave1.do wave2.do]

This allows us to do different things for different simulators by doing different actions in a vendor_DoWaves script.
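
As a rough sketch only (this is not the implementation that was eventually added to OSVVM), a DoWaves helper for a ModelSim-family simulator might simply turn each waveform file into a -do option:

# Hypothetical sketch of DoWaves, not the actual OSVVM implementation.
# Builds a list of simulator options from a list of waveform files.
proc DoWaves {args} {
  set WaveOptions {}
  foreach WaveFile $args {
    lappend WaveOptions -do $WaveFile
  }
  return $WaveOptions
}

How simulate consumes the returned options, and what a vendor-specific variant does for other simulators, depends on the actual implementation.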

Let me know if any of this addresses your issue and if you would like me to implement the DoWaves capability.

aakulinsk commented 2 years ago

My goal is to get the same capability in OSVVM.

When simulating big designs I like to keep the waveforms for each module instance in a separate waveform file, so the waveform files used depend on which functionality is being tested at the moment.

The solution you've proposed looks very promising. It will allow using multiple waveform source files, which is what I am looking for. So yes, it would be very nice to have the DoWaves capability.

aakulinsk commented 2 years ago

An additional question: is there a way to disable the OSVVM failure "FAILED: Simulate Did Not Run"?

I would like to compile all the source files in one .pro script and then separately run the simulation (e.g. from the ModelSim command prompt), using the suggested:

simulate tb_top_wrapper [generic SIM_TIME 10us]

JimLewis commented 2 years ago

I am guessing you are seeing this in a Build Summary Report.

In OSVVM, we refer to the test case name twice: once in the script file and once in the VHDL code. These names need to match, or "FAILED: Simulate Did Not Run" is produced.

Why does OSVVM expect a correlation of the names? In the past, I have used a methodology where the test case / test sequencer is an architecture of a lower level block (TestCtrl) inside the TestHarness (maybe like your tb_top_wrapper). To simulate, an architecture of TestCtrl is compiled, and then when we say simulate TestHarness, the most recently compiled architecture is loaded. The problem is that if that architecture did not compile, then some other test case is run instead. Hence, the correlation allows us to get a test failure if this happens.

I use these scripts to run the regression tests for the OSVVM libraries. Currently there are 586 test cases. It is way beyond what can be reviewed by hand - except for any failures. Hence, we need this sort of detailed checking in the scripts.

As a result, to avoid this message we need to set the test case name in the scripts by doing the following. This assumes that the test case is an architecture of TestCtrl that we compile just before running a simulation:

TestCase Test1
analyze TestCtrl_Test1.vhd
simulate tb_top_wrapper 

Then in TestCtrl_Test1.vhd, in the ControlProc, we set the test name:

ControlProc : process
begin
  -- Initialization of test
  SetTestName("Test1") ; -- Set the test name
  . . . 
  EndOfTestReports ;
  std.env.stop ;
  wait ;  -- prevents "possible infinite loop warnings"
end process ControlProc ; 

Note historically SetTestName was called SetAlertLogName (and that call is still supported via an alias).

I will take a look at DoWaves and see if I can get it in the 2022.10 release (which should happen this week).

aakulinsk commented 2 years ago

Sorry, I've gotten a bit lost here. From my understanding, the name given to TestCase in the .pro run script has to be the same as the name used in SetTestName/SetAlertLogName in the test sequencer. Is there something else I am missing?

Another question to clarify, taking the OsvvmLibraries/AXI4/Axi4/RunDemoTests.pro script as an example:

RunTest  ./TestCases/TbAxi4_MemoryReadWriteDemo1.vhd

Can the command RunTest in the script be replaced by:

TestCase TbAxi4_MemoryReadWriteDemo1
analyze ./TestCases/TbAxi4_MemoryReadWriteDemo1.vhd
simulate TbAxi4_MemoryReadWriteDemo1 

Btw. Thanks for looking into DoWaves!

JimLewis commented 2 years ago

Sorry, I've gotten a bit lost here. From my understanding, the name given to TestCase in the .pro run script has to be the same as the name used in SetTestName/SetAlertLogName in the test sequencer. Is there something else I am missing?

Oops, sorry, cut and paste error. Yes, they must be the same.

Can the command RunTest in the script be replaced by:

Yes. RunTest is just shorthand for exactly that. I made RunTest for my use case, where I run the test using a configuration declaration and name the file that contains the architecture of TestCtrl and the configuration (at the bottom) the same as the configuration name. It is also good for simple, small testbenches.

aakulinsk commented 2 years ago

Great! This will be a good base to show what I am trying to do. Actually, what I want may go against the library's approach of compiling and running all simulation source files with one script, because I want to compile the source files using a .pro script and then run the simulation separately with the simulate command. But maybe there is some way for me to achieve this.

To try out whether this is possible, I've taken an example from the repository, OsvvmLibraries/AXI4/Axi4/RunDemoTests.pro, since I can assume that the testbench and compilation scripts there are written correctly.

Then I simplified the script to run only one example (TbAxi4_MemoryReadWriteDemo1) and replaced the RunTest command with the three equivalent commands. After the changes, RunDemoTests.pro contained:

TestSuite Axi4Full
library osvvm_TbAxi4

include ./testbench

TestCase TbAxi4_MemoryReadWriteDemo1
analyze ./TestCases/TbAxi4_MemoryReadWriteDemo1.vhd
simulate TbAxi4_MemoryReadWriteDemo1 

Here I tried to build & run the simulation, and everything worked fine.

Next I removed the simulate command from the RunDemoTests.pro script, so that it contained:

TestSuite Axi4Full
library osvvm_TbAxi4

include ./testbench

TestCase TbAxi4_MemoryReadWriteDemo1
analyze ./TestCases/TbAxi4_MemoryReadWriteDemo1.vhd

and built it. The source files compiled correctly, but OSVVM generated an error:

# BuildError: Axi4_RunDemoTests FAILED,  Passed: 0,  Failed: 1,  Skipped: 0,  Analyze Errors: 0,  Simulate Errors: 0,  Build Error Code: 0

The library even generated all the reports during the compilation step (without running the current simulation), I guess based on the previous run. There, the error is shown and described in Axi4_RunDemoTests.html: FAILED: Simulate Did Not Run. Which is, in the end, very true.

Next I ran the simulation from the command prompt using:

simulate TbAxi4_MemoryReadWriteDemo1 

This updates the simulation results shown in the /reports/Axi4Full/TbAxi4_MemoryReadWriteDemo1.html report, but the Simulate Did Not Run failure in the Axi4_RunDemoTests.html report remains.

So here is my question: would it be possible to change the simulate command so that it removes the Simulate Did Not Run failure?

JimLewis commented 2 years ago

Short Answer:
Don't put TestCase in your RunDemoTests.pro unless you are also running simulate from there.

Why?
The TestCase command tells the reporting to put that test case name in the Build Summary Report. However, since simulate did not run, there are no completion results for that test case, so it is reported as failed.

Why do we do this? Sometimes a simulation stops without reaching the point of completion and, as a result, does not produce any passed or failed message. This ensures that a test is reported as failed when it fails to reach the point of calling EndOfTestReports.

Also note that if you intend to run and debug interactively, you may wish to do:

SetInteractiveMode
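
For example (placement here is hypothetical; see the Script User Guide for the exact behavior), it can be called before the build:

SetInteractiveMode
# the .pro path below is illustrative; use the path to your own build script
build ./OsvvmLibraries/AXI4/Axi4/RunDemoTests.pro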

aakulinsk commented 2 years ago

I tried your solution. This time the build script failed without any error message:

# BuildError: Axi4_RunDemoTests FAILED,  Passed: 0,  Failed: 0,  Skipped: 0,  Analyze Errors: 0,  Simulate Errors: 0,  Build Error Code: 0

and the TbAxi4_MemoryReadWriteDemo1 test case completely disappeared from the Axi4_RunDemoTests.html report.

Nothing changed in the state of the reports after I ran these two commands at the command prompt:

TestCase TbAxi4_MemoryReadWriteDemo1
simulate TbAxi4_MemoryReadWriteDemo1

So compared to the version from my last post, I can't even access the .html logs for TbAxi4_MemoryReadWriteDemo1. Previously I could access the correct simulation logs from Axi4_RunDemoTests.html; only the test case summary showed that TbAxi4_MemoryReadWriteDemo1 failed due to the Simulate Did Not Run failure.

JimLewis commented 2 years ago

I am not sure why you got a BuildError. Something must have happened in the scripts. Follow the directions that follow BuildError to get a full error report. I think it is puts ::osvvm::BuildErrorInfo, but whatever is on the screen supersedes this.

WRT the information not being in the Build Summary Report, only things that are run during a given build are put in the report. OTOH, you will still get the Test Case HTML reports in the reports directory.

Off to a meeting.

JimLewis commented 2 years ago

When you are running simulate manually and you are running OSVVM, you get the passed/failed message from the simulation in the form of the %% DONE line that is printed by EndOfTestReports. Again, the detailed reports are in the reports directory.

JimLewis commented 2 years ago

DoWaves is now in the dev branch.

JimLewis commented 2 years ago

DoWaves was released with 2022.10.

JimLewis commented 2 years ago

Closing issue, as DoWaves has been released.