mfem / PyMFEM

Python wrapper for MFEM
http://mfem.org
BSD 3-Clause "New" or "Revised" License

Tests are skipped (+ passed) on cuda build #229

Open · justinlaughlin opened this issue 4 months ago

justinlaughlin commented 4 months ago

Lately, ex18 and ex23 have been causing tests to fail. I noticed that on the CUDA build, no tests actually run on either mfem or pymfem, yet the result is reported as PASS(-1) because "generated files are the same". This should probably be reported as a FAIL.

Here is an example:

https://github.com/mfem/PyMFEM/actions/runs/9752638384/job/26916497223

Running : ex0.py
/opt/hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/mfem/external/ser/examples/ex0: error while loading shared libraries: libcusparse.so.12: cannot open shared object file: No such file or directory
  File "/opt/hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/mfem/ser.py", line 2, in <module>
          : False
No difference in generate files (Passed output file check) in exe
ex0  PASS(-1): generated files are the same, but terminal output differs
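For what it's worth, the spurious pass seems to boil down to comparing two empty sets of output files. A minimal sketch of that failure mode, with hypothetical helper and directory names (not the actual harness code):

```python
from pathlib import Path

def check_generated_files(out_dir: Path, ref_dir: Path) -> bool:
    """Hypothetical stand-in for the output-file comparison.

    If the example crashed before producing any output, both file lists
    are empty and the comparison trivially "passes".
    """
    generated = sorted(p.name for p in out_dir.glob("*"))
    reference = sorted(p.name for p in ref_dir.glob("*"))
    # When the example never ran: generated == reference == [] -> True.
    return generated == reference
```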
sshiraiwa commented 3 months ago

Closing this, since it was addressed with the 4.7 release.

justinlaughlin commented 3 months ago

I think this should stay open - the 4.7 release fixed the CUDA installation, but this issue is that CUDA tests always pass even when they aren't actually running; they just look like they are (their output is PASS(-1)). We still see this now, e.g. in this run

sshiraiwa commented 3 months ago

You are right. Reopened. It looks like both examples (C++ and Python) are not running due to a run-time library linking issue?
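One quick way to confirm the linking problem on the CI runner (just a sanity check, not part of the harness) would be to try loading the library directly from Python:

```python
import ctypes

try:
    # Attempt to dlopen the CUDA sparse library the examples link against.
    ctypes.CDLL("libcusparse.so.12")
    print("libcusparse.so.12 found")
except OSError as err:
    # Same "cannot open shared object file" error as in the CI log.
    print(f"libcusparse.so.12 not loadable: {err}")
```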

justinlaughlin commented 3 months ago

Looks like it. We might also want to consider changing the test behavior so that this case is reported as a FAIL instead of PASS(-1); the current behavior lets failures slip under the radar. If it is common/okay for terminal output to differ, maybe we can add a condition that the test fails when no output files are generated at all? A rough sketch of what I mean is below.
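Something along these lines is what I have in mind; the function and argument names are hypothetical, not the actual test-runner code:

```python
from pathlib import Path

def classify_result(out_dir: Path, ref_dir: Path, terminal_output_matches: bool) -> str:
    """Hypothetical result classification for the example tests."""
    generated = sorted(p.name for p in out_dir.glob("*"))
    reference = sorted(p.name for p in ref_dir.glob("*"))

    # Proposed change: no generated files at all means the example never
    # really ran, so report a hard FAIL instead of PASS(-1).
    if not generated:
        return "FAIL"
    if generated == reference:
        # Keep the existing tolerance for differing terminal output.
        return "PASS" if terminal_output_matches else "PASS(-1)"
    return "FAIL"
```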