I don't have the bandwidth right now to minimize an example, so I'm reporting this for future me:
In the context of fusesoc, I have a target which generates an iverilog testbench. vlog_tb_utils requires a timescale to be defined for the timeout functionality to work.
The testbench depends on the vlog_tb_utils core as well as my fork of the verilog-axis core. I also brought the source up to date with upstream.
I understand the choice not to hardcode a timescale into this core and to rely instead on, say, fusesoc's iverilog timescale option, which prepends a source file containing the desired timescale. Unfortunately, every source file in verilog-axis nowadays uses the resetall directive.
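To make the failure mode concrete, here is a minimal sketch of the effective compile order; the file names and module contents are illustrative, not taken from my actual setup:

```verilog
// 1) timescale.v -- prepended by the tool-level timescale option
`timescale 1ns / 1ps

// 2) a verilog-axis source -- wraps itself in resetall, which (at least in
//    iverilog) also clears the timescale that was in effect
`resetall
module axis_dummy (input wire clk);
endmodule
`resetall

// 3) vlog_tb_utils, or any later source -- no timescale is in effect here
//    anymore, so iverilog warns and time-based timeouts misbehave
module tb_dummy;
endmodule
```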
Depending on the source file order that fusesoc decides, this means that vlog_tb_utils and any source file from any core listed after the verilog-axis core will lose the injected timescale information, leading to warnings from iverilog at best, and at worst to my simulations never timing out (unless I add extra code and redefine the timescale directive locally).
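For completeness, the local workaround alluded to above looks roughly like this; the instance name and surrounding testbench are made up for the sketch:

```verilog
// Workaround I'd rather avoid: re-state the timescale in the testbench
// itself so it no longer depends on whatever was (or wasn't) in effect
// after the verilog-axis sources were compiled.
`timescale 1ns / 1ps

module tb;
  // vlog_tb_utils only times out as expected once a timescale is in effect
  vlog_tb_utils vtu ();

  // ... rest of the testbench ...
endmodule
```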
Is there a way to influence core order? If this is too invasive, is it possible to add, say, a define indicating that the verilog-axis core is running under fusesoc, so that the resetalls can be compiled out? I am definitely open to options here.
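As one hypothetical shape for that option, a guard like the following in each source file would do it; the define name here is entirely made up:

```verilog
// Hypothetical opt-out; VERILOG_AXIS_NO_RESETALL is not an existing define,
// just a placeholder for whatever name the project would prefer.
`ifndef VERILOG_AXIS_NO_RESETALL
`resetall
`endif
```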