In the ncrystaldev repo (and thus in the simple-build-dgcode CI) we have several unit tests that fail on macOS due to tiny floating-point variations between macOS and Linux (or clang vs. gcc?). Although it would be great to pin down the cause of these discrepancies, most likely they are simply an inherent feature of floating-point irreproducibility across platforms, and we should at least make the unit tests robust enough that they do not flag such irrelevant differences.
For reference, here is how I am currently avoiding these tests on macOS:
if [ "$RUNNER_OS" == "macOS" ]; then
export tmp="${tmp}"',!sb_nclongextratests_testdosplot'
export tmp="${tmp}"',!sb_nclongextratests_testplots2'
export tmp="${tmp}"',!sb_nclongtests_testloadvdos'
export tmp="${tmp}"',!sb_nclongtests_testrange'
export tmp="${tmp}"',!sb_nclongtests_vdosloaddbg'
fi
sb -t --testexcerpts=100 --testfilter="{$tmp}"
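
One way to make such tests robust, rather than skipping them outright, could be to normalise floating-point literals in the test output to a limited number of significant digits before diffing against the reference log. Below is a minimal Python sketch of the idea; the round_floats helper and the 12-digit default are illustrative assumptions, not part of our actual test framework:

import re

def round_floats(text, sig=12):
    """Round every floating-point literal in text to sig significant
    digits, so that tiny last-digit differences between platforms do
    not show up when diffing against a reference log."""
    flt = re.compile(r'-?\d+\.\d+(?:[eE][+-]?\d+)?')
    return flt.sub(lambda m: '%.*g' % (sig, float(m.group(0))), text)

# The two lines below differ only in the last digit, as might happen
# between a gcc/linux and a clang/macos build, but they compare equal
# after normalisation:
a = 'xsect = 2.3456789012345678 barn'
b = 'xsect = 2.3456789012345681 barn'
assert round_floats(a) == round_floats(b)

If we applied a normalisation like this to both the generated output and the stored reference files, the five tests above could presumably run on macOS again instead of being filtered out.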