tdavetech opened this issue 2 months ago
What version of HDF5 are you using?
Here is the current config:
SUMMARY OF THE HDF5 CONFIGURATION
=================================
HDF5 Version: 1.8.12
Configured on: Thu Sep 16 02:22:24 UTC 2021
Configured by: mockbuild@buildhw-x86-14.iad2.fedoraproject.org
Configure mode: production
Host system: x86_64-redhat-linux-gnu
Uname information: Linux buildhw-x86-14.iad2.fedoraproject.org 5.12.14-300.fc34.x86_64 #1 SMP Wed Jun 30 18:30:21 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
Byte sex: little-endian
Libraries: static, shared
Installation point: /usr
Compilation Mode: production
C Compiler: /usr/bin/gcc ( gcc (GCC) 4.8.5 20150623 )
CFLAGS: -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic
H5_CFLAGS: -std=c99 -pedantic -Wall -Wextra -Wundef -Wshadow -Wpointer-arith -Wbad-function-cast -Wcast-qual -Wcast-align -Wwrite-strings -Wconversion -Waggregate-return -Wstrict-prototypes -Wmissing-prototypes -Wmissing-declarations -Wredundant-decls -Wnested-externs -Winline -Wfloat-equal -Wmissing-format-attribute -Wmissing-noreturn -Wpacked -Wdisabled-optimization -Wformat=2 -Wunreachable-code -Wendif-labels -Wdeclaration-after-statement -Wold-style-definition -Winvalid-pch -Wvariadic-macros -Winit-self -Wmissing-include-dirs -Wswitch-default -Wswitch-enum -Wunused-macros -Wunsafe-loop-optimizations -Wc++-compat -Wstrict-overflow -Wlogical-op -Wlarger-than=2048 -Wvla -Wsync-nand -Wframe-larger-than=16384 -Wpacked-bitfield-compat -Wstrict-overflow=5 -Wjump-misses-init -Wunsuffixed-float-constants -Wdouble-promotion -Wsuggest-attribute=const -Wtrampolines -Wstack-usage=8192 -Wvector-operation-performance -Wsuggest-attribute=pure -Wsuggest-attribute=noreturn -Wsuggest-attribute=format -O3 -fomit-frame-pointer -finline-functions
AM_CFLAGS:
CPPFLAGS:
H5_CPPFLAGS: -D_POSIX_C_SOURCE=199506L -DNDEBUG -UH5_DEBUG_API
AM_CPPFLAGS: -D_LARGEFILE_SOURCE -D_LARGEFILE64_SOURCE -D_BSD_SOURCE
Shared C Library: yes
Static C Library: yes
Statically Linked Executables: no
LDFLAGS: -Wl,-z,relro
H5_LDFLAGS:
AM_LDFLAGS:
Extra libraries: -lsz -lz -ldl -lm
Archiver: ar
Ranlib: ranlib
Debugged Packages:
API Tracing: no
Fortran: yes
Fortran Compiler: /usr/bin/gfortran ( GNU Fortran (GCC) 4.8.5 20150623 )
Fortran 2003 Compiler: yes
Fortran Flags: -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -I/usr/lib64/gfortran/modules
H5 Fortran Flags:
AM Fortran Flags:
Shared Fortran Library: yes
Static Fortran Library: yes
C++: yes
C++ Compiler: /usr/bin/g++ ( g++ (GCC) 4.8.5 20150623 )
C++ Flags: -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic
H5 C++ Flags:
AM C++ Flags:
Shared C++ Library: yes
Static C++ Library: yes
Parallel HDF5: no
High Level library: yes
Threadsafety: no
Default API Mapping: v18
With Deprecated Public Symbols: yes
I/O filters (external): deflate(zlib),szip(encoder)
I/O filters (internal): shuffle,fletcher32,nbit,scaleoffset
MPE: no
Direct VFD: no
dmalloc: no
Clear file buffers before write: yes
Using memory checker: no
Function Stack Tracing: no
GPFS: no
Strict File Format Checks: no
Optimization Instrumentation: no
Large File Support (LFS): yes
What version of HDF5 are you using?
1.8.12
I suspect that the error you see is because you are using an old HDF5 version, HDF5 has changed its memory allocation mechanisms over the years. Any chance you can use HDF5 1.10.x or 1.12.x or 1.14.x to see if the problem goes away?
Possibly. My current version matches the version needed by the netcdf version I am trying to install though. Would it still be worth a shot?
I'll be taking a look to see if we need to bump to 1.10.x as a minimum version with the next release candidate; I was initially hesitant to do this (for many years) given the performance regressions observed in 1.10.x+ in parallel environments, but it looks like this has been largely resolved by the latest 1.14.x+ code.
I've just checked the documentation for h5allocate_memory(); it was introduced in HDF5 1.8.15. So you'll need to bump up to at least that version, and I will correct our documentation.
Is it fairly safe to update HDF5? I mean, I guess you could always roll back to the previous version, but what's the likelihood that it breaks something in the first place?
I mean, it depends on what libraries are linked against it, which can be tricky. If it's only netCDF, you can upgrade in place and reconfigure/recompile netCDF from source (or compile for the first time, in this case) without issue.
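One way to check what currently depends on the HDF5 shared library before upgrading; the library path and package name below are illustrative for a CentOS/RPM system:

```shell
# List the shared-library dependencies of an installed netCDF library
# to confirm it really does link against libhdf5:
ldd /usr/lib64/libnetcdf.so | grep -i hdf5

# On an RPM-based system such as CentOS 7, list every installed package
# that declares a dependency on the hdf5 package:
rpm -q --whatrequires hdf5
```

If only netCDF (and your own builds) show up, an in-place upgrade followed by a recompile of netCDF is low risk; if other system packages appear, installing the newer HDF5 under a separate prefix is the safer route.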
A code search suggests that H5allocate_memory is used only in filter plugins and the remote byterange driver. Therefore it may be possible to use an older HDF5 version such as 1.8.12 by disabling these features when building netcdf-c. Test by adding this to your configure step, then rebuild netcdf-c.
--disable-byterange
--disable-plugins
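Put together, the configure step for netcdf-c against HDF5 1.8.12 might look like this; the source directory name and install prefix are illustrative:

```shell
# Build netcdf-c without the two features that call H5allocate_memory,
# so it can link against an HDF5 older than 1.8.15.
cd netcdf-c-4.9.2
./configure --disable-byterange --disable-plugins --prefix=/usr/local
make
make check   # optional, but a good sanity check before installing
sudo make install
```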
PR #3009 updated the minimum required HDF5 version to 1.8.15. This issue can be closed.
Hello, I am trying to install version 4.9.2 on CentOS 7 and am getting this error when running make.