sstsimulator / sst-macro

SST Macro Element Library
http://sst-simulator.org/

TERMINATED error on "make check" #674

Closed: sunsirui closed this issue 2 years ago

sunsirui commented 2 years ago

Dear All: After I changed some code and built with "make check", the errors were as follows:

/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_82.chk-out TERMINATED
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_83.chk-out TERMINATED
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_103.chk-out TERMINATED
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_104.chk-out TERMINATED
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_115.chk-out TERMINATED
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_190.chk-out TERMINATED
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_207.chk-out TERMINATED
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_239.chk-out TERMINATED
144 of 157 tests failed
Makefile:1928: recipe for target 'check-local' failed
make[3]: *** [check-local] Error 1

But "make install" succeeded.
I don't know, what is the reason for "TERMINATED" on "make check", why does this error occur, and what is the impact?

Thanks to anyone who can shed some light on this. -sirui

jpkenny commented 2 years ago

If you run, e.g., cat /home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/testsuite_mpi_82.* you should get some idea of why the tests are failing.

sunsirui commented 2 years ago

Hello jpkenny:

When I try:
cat /home/sunsirui/pcie/origin/sstmacro-11.0.0/build/tests/test_core_apps_ping_pong_snappr.*

FAILED: test_core_apps_ping_pong_snappr.chk-out: Line missing from CHK file: 20384: 0.4615 GB/s

ping-pong between 0 and 3
4: 0.0098 GB/s
8: 0.0190 GB/s
16: 0.0360 GB/s
32: 0.0656 GB/s
64: 0.1111 GB/s
128: 0.0863 GB/s
512: 0.1380 GB/s
1024: 0.3100 GB/s
2048: 0.2920 GB/s
4096: 0.2837 GB/s
8192: 0.2798 GB/s
20384: 0.4607 GB/s
40768: 0.4572 GB/s
81536: 0.4554 GB/s
163072: 0.4545 GB/s
326144: 0.4555 GB/s
652288: 0.4560 GB/s
1304576: 0.4558 GB/s
ping-pong between 2 and 1
4: 0.0098 GB/s
8: 0.0190 GB/s
16: 0.0360 GB/s
32: 0.0656 GB/s
64: 0.1111 GB/s
128: 0.0863 GB/s
512: 0.1380 GB/s
1024: 0.3100 GB/s
2048: 0.2920 GB/s
4096: 0.2837 GB/s
8192: 0.2798 GB/s
20384: 0.4607 GB/s
40768: 0.4572 GB/s
81536: 0.4554 GB/s
163072: 0.4545 GB/s
326144: 0.4555 GB/s
652288: 0.4560 GB/s
1304576: 0.4558 GB/s
Aggregate time stats:
  state Inactive: 1.61165 s
  idle:X: 0.01113 s
  active:X: 0.01042 s
  idle:injection: 0.01113 s
  active:injection: 0.01042 s
Estimated total runtime of 0.01149142 seconds
/home/sunsirui/pcie/origin/sstmacro-11.0.0/build/bin/.libs/sstmac --debug="" \
  --configfile="../../tests/test_configs/test_ping_pong_snappr.ini" \

Also, what is a "CHK file"? I only added a delay that varies with the length of the incoming data in the sim_transport.c file:

// extra delay modeled as a linear function of the incoming message length
double add_sec =  (-1E-12) * byteLength + (6E-08);
sstmac::TimeDelta new_addTime(add_sec);
// charge the delay to the simulated application as compute time
if (new_addTime.ticks())
{
  parent_->compute(new_addTime);
}
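
For illustration only (this is a standalone sketch, not sst-macro code), here is what that formula evaluates to at a few sample message lengths, assuming byteLength is in bytes and add_sec is in seconds. Note the negative slope: the added delay shrinks with message size, crosses zero at 60000 bytes, and goes negative for longer messages, which may be worth double-checking.

#include <cstdio>

int main() {
  // sample message lengths in bytes (hypothetical values)
  const double lengths[] = {4, 1024, 8192, 60000, 1304576};
  for (double byteLength : lengths) {
    // same linear model as the snippet above
    double add_sec = (-1E-12) * byteLength + (6E-08);
    std::printf("byteLength = %9.0f -> add_sec = %+e s\n", byteLength, add_sec);
  }
  return 0;
}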
jpkenny commented 2 years ago

Ok, make check is checking outputs against reference files. If you're changing the behavior of the network models then the outputs will change and the tests are no longer going to pass. That's all you're seeing.
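
To make the mechanism concrete, here is a minimal sketch of the kind of line-by-line comparison involved. This is an illustration only, not sst-macro's actual test harness, and the file names are hypothetical: it reports any reference line that is absent from the generated output, which is the "Line missing from CHK file" failure shown above.

#include <fstream>
#include <iostream>
#include <set>
#include <string>

int main(int argc, char** argv) {
  if (argc != 3) {
    std::cerr << "usage: " << argv[0] << " reference.chk generated.chk-out\n";
    return 2;
  }
  // collect every line the test actually produced
  std::ifstream out(argv[2]);
  std::set<std::string> produced;
  for (std::string line; std::getline(out, line);) produced.insert(line);

  // any expected line not found in the generated output is a failure
  std::ifstream ref(argv[1]);
  int missing = 0;
  for (std::string line; std::getline(ref, line);) {
    if (!produced.count(line)) {
      std::cout << "Line missing from CHK file: " << line << "\n";
      ++missing;
    }
  }
  return missing == 0 ? 0 : 1;
}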

sunsirui commented 2 years ago

So can I understand it this way: the reference file holds a fixed expected output. If I change some behavior in sstmacro that the tests exercise (a behavior change that affects the output), the new result will differ from the fixed values in the reference file, and an error will be reported. But that doesn't mean the logic change in my code is wrong. Yes?

jpkenny commented 2 years ago

Yes, if you change the behavior of the model the tests will fail, but all that tells you is that the behavior/output of the model has changed.

Problematic modifications of the model could also, for example, cause a deadlock or exception/abort (but that's not what was happening in the output you provided), so you have to inspect the test outputs to see why they are failing.

If you're forking the code base and modifying the models, you could update the reference files for future testing, but you need to be confident that your new reference outputs are correct.
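
As a sketch of that workflow (the paths and reference-file layout here are assumptions, not sst-macro's actual conventions; check the tests/ Makefile for the real mechanism), updating the references amounts to copying the newly generated outputs over the old reference files once you trust them:

#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

int main() {
  // where "make check" writes its outputs (assumed location)
  fs::path build_tests = "build/tests";
  // assumed reference-file location -- verify against the real layout
  fs::path reference = "tests/reference";

  for (const auto& entry : fs::directory_iterator(build_tests)) {
    if (entry.path().extension() == ".chk-out") {
      fs::path dest = reference / entry.path().filename();
      // overwrite the old reference with the newly generated output
      fs::copy_file(entry.path(), dest, fs::copy_options::overwrite_existing);
      std::cout << "updated " << dest << "\n";
    }
  }
  return 0;
}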

sunsirui commented 2 years ago

Ok, I understand, thank you very much for your answer!