Describe the bug
When compiling HDF5 with NVHPC versions 23.5 - 23.9 (other versions may also be affected) with -O1 (or higher) and -DNDEBUG, failures occur in the following tests:
- H5PLUGIN-filter_plugin: "incorrect metadata checksum after all read attempts"
- H5TEST-flush2: segmentation fault during "Testing H5Fflush (part2 with flush + SWMR)"
- H5TEST-testhdf5-base: many instances of "incorrect metadata checksum after all read attempts" during "Testing -- Attributes (attr)"
- MPI_TEST_t_filters_parallel: "malloc(): invalid next size (unsorted)" after "Testing fill time H5D_FILL_TIME_NEVER"
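The failing configuration can be sketched roughly as below. This is an illustrative reproduction only: a CMake build is assumed, the source path is a placeholder, and the exact options are not the project's documented reproduction steps.

```shell
# Sketch of an NVHPC build that matches the failing configuration:
# -DNDEBUG plus -O2, using nvc as the C compiler. Paths and option
# choices here are assumptions, not taken from the original report.
CC=nvc CXX=nvc++ FC=nvfortran cmake \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_C_FLAGS_RELEASE="-O2 -DNDEBUG" \
  /path/to/hdf5-source
cmake --build . -j
ctest
```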
Compiling without -DNDEBUG appears to pass testing, but is not ideal because it compiles in asserts and other library debug code. Since these tests pass at optimization levels -O1 and -O0, and it is currently unclear whether the failures stem from issues in HDF5 or in the 'nvc' compiler, the maximum optimization level for NVHPC has been capped at -O1 until the failures can be resolved.
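A cap like the one described could be expressed in CMake roughly as follows. This is a sketch of the idea only, not HDF5's actual build logic; the flag handling is illustrative.

```cmake
# Sketch only: clamp optimization to -O1 when the C compiler is NVHPC.
# Mirrors the workaround described above; not HDF5's exact code.
if (CMAKE_C_COMPILER_ID MATCHES "NVHPC")
  # Replace any higher -O level in the release flags with -O1.
  string (REGEX REPLACE "-O[2-9]" "-O1"
          CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE}")
endif ()
```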
Note that even at the -O1 optimization level, a sporadic failure still appears in the Java JUnit tests, seen occasionally in JUnit-TestH5Pfapl and JUnit-TestH5D. It is also unclear whether this is an issue in HDF5 or in the 'nvc' compiler. Java testing has been disabled in the NVHPC GitHub Actions workflow for now.
Note also that NVHPC version 23.9 fails to compile the test file test/tselect.c with a "use of undefined value" compiler error at -O2 or higher. NVIDIA is aware of this issue and has suggested lowering the optimization level to -O1 for the time being: https://forums.developer.nvidia.com/t/hdf5-no-longer-compiles-with-nv-23-9/269045.
Expected behavior
HDF5 tests should pass with the NVHPC compiler collection at any optimization level.
Platform (please complete the following information)
We've heard that this is a compiler bug that will be resolved in the next release of the compiler. Kicking to 1.14.5 since there's nothing we can do about it now.