ekernf01 opened 6 years ago
Similar issues in a different project seem to suggest I could just rename the file. Example:
https://github.com/BVLC/caffe/issues/1463
For now, I'm punting, but I will report back if I try renaming.
For some reason autoconf seems to pick up version 1.10.2 instead of 1.8.20
checking for HDF5 libraries... yes (version 1.10.2)
Maybe some path is wrong?
I'm hitting this issue too. As far as I can tell, everything in my hdf5 configuration is correct. I'm running RedHat Linux 3.10.0-957.1.3.el7.x86_64. I compiled hdf5 from source using these settings:
../configure --prefix=/usr/local --enable-fortran --enable-cxx
Here's what I'm seeing when attempting to install hdf5r v1.0.1 in R:
checking for a sed that does not truncate output... /usr/bin/sed
checking for gawk... gawk
checking for grep that handles long lines and -e... /usr/bin/grep
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking how to run the C preprocessor... gcc -E
checking for egrep... /usr/bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking for h5cc... /usr/local/bin/h5cc
checking for HDF5 libraries... yes (version 1.10.4)
checking hdf5.h usability... yes
checking hdf5.h presence... yes
checking for hdf5.h... yes
checking for H5Fcreate in -lhdf5... yes
checking for hdf5_hl.h... yes
checking for H5LTpath_valid in -lhdf5_hl... yes
checking for main in -lhdf5_hl... yes
checking for matching HDF5 Fortran wrapper... /usr/local/bin/h5fc
Found hdf5 with version: 1.10.4
checking for ggrep... /usr/bin/grep
checking whether /usr/bin/grep accepts -o... yes
checking for ggrep... (cached) /usr/bin/grep
checking whether /usr/bin/grep accepts -o... yes
configure: creating ./config.status
config.status: creating src/Makevars
[...]
Error: package or namespace load failed for ‘hdf5r’ in dyn.load(file, DLLpath = DLLpath, ...):
unable to load shared object '/XXX/hdf5r/libs/hdf5r.so':
libhdf5_hl.so.100: cannot open shared object file: No such file or directory
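As a general diagnostic (not from the original thread), ldd shows which shared-library dependencies of a binary the dynamic linker cannot resolve; if libhdf5_hl.so.100 appears as "not found", the lib directory is missing from the search path. The /XXX path below is the placeholder from the error message above, so substitute your own:

```shell
# List the shared-library dependencies of the hdf5r binary that the
# dynamic linker cannot resolve. Replace /XXX/... with the actual path
# from your own error message.
ldd /XXX/hdf5r/libs/hdf5r.so | grep "not found"
```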
If I install hdf5 using yum, it works but I can only build hdf5r v1.0.0, since the hdf5 version is too old:
yum install hdf5-devel
# hdf5-1.8.12-10.el7.x86_64
hdf5r requires hdf5 >= 1.8.13, so this becomes difficult to install on RedHat unless we compile from source.
This package is a dependency for Seurat, which is a very popular tool for single-cell RNA-seq analysis. Has anybody else seen this on a different Linux distribution? @satijalab @andrewwbutler @mojaveazure
Regarding the hdf5r.so error above, here's how to fix that issue. If you've installed an updated version of hdf5 (for example into /usr/local), you need to ensure that LD_LIBRARY_PATH in R contains the corresponding lib path (e.g. /usr/local/lib). The recommended method is to set LD_LIBRARY_PATH in your ~/.Renviron file.
Here's how to check it in R:
Sys.getenv("LD_LIBRARY_PATH")
Note that LD_LIBRARY_PATH should not be set inside .bashrc; use the .Renviron method instead.
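For example, a minimal ~/.Renviron entry might look like the following (a sketch assuming hdf5 was installed under /usr/local; adjust the path to your own prefix):

```shell
# ~/.Renviron -- read by R at startup; ${...} references are expanded
LD_LIBRARY_PATH=/usr/local/lib:${LD_LIBRARY_PATH}
```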
@ekernf01 - I think your error might have had something to do with Anaconda/Miniconda
I'm commenting here in case anyone else stumbles on the same problem.
The issue is that anaconda exposes some files on the path related to libhdf5, but does not expose the corresponding .so files on LD_LIBRARY_PATH. This messes up the configuration script because it will look for the wrong versions of things.
The fix is easy if you have an updated conda executable. Just run conda deactivate to get out of any conda environment before running R. Then launch R and install the package. In general, I always try to do this before doing any installation that requires compilation (but had forgotten to today on a new system).
Alternatively, you can customize your .Renviron file so that conda isn't in the PATH at all, and you don't have to worry about environment deactivation. Here's what I'm currently doing in my Renviron.site file for my machines:
PATH="/usr/local/bin:/usr/bin:/bin"
If you're on Linux and have admin access, make sure all of your lib/ directories are configured correctly for ld (ldconfig), which requires files in /etc/ld.so.conf.d/. Then you don't have to worry about LD_LIBRARY_PATH.
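The ldconfig route can be sketched as follows (assuming an install prefix of /usr/local and root access; the file name under /etc/ld.so.conf.d/ is arbitrary):

```shell
# Register /usr/local/lib with the dynamic linker, then rebuild the cache
echo "/usr/local/lib" | sudo tee /etc/ld.so.conf.d/local-hdf5.conf
sudo ldconfig
# Verify that libhdf5_hl is now in the linker cache
ldconfig -p | grep libhdf5_hl
```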
If anyone else like me is using a remote server and hits this error, I recommend creating a new conda environment and installing the packages again. It was really helpful. Credit to this answer: https://www.biostars.org/p/498049/
Hi @hhoeflin, is it possible to call dyn.load("...../libhdf5_hl.so.100") in the .onLoad function manually? On Linux servers (such as RedHat), HDF5 is often installed as a Linux module. The dynamic libraries might not necessarily be loaded (similar to this case: even when the compiler flag is set correctly, the HDF5 library might not be added to the ld.so cache). One solution could be to explicitly call dyn.load() when the package is loaded.
My current solution on our server is:
1. Set LD_LIBRARY_PATH in the .Renviron
2. Call dyn.load to load libhdf5....so.100
3. library(hdf5r)
> @ekernf01 - I think your error might have had something to do with Anaconda/Miniconda [...]
Thanks a lot, that solution worked for me!
> @ekernf01 - I think your error might have had something to do with Anaconda/Miniconda [...]
Thank you for this tip, it finally worked! :heart:
This error message is very similar to #94, but I tried the fix from vondoReshi and it doesn't solve this. Note that the filename ends in 100, not 10.

Prior to running install.packages("hdf5r"), I downloaded hdf5 1.8.20, extracted it, and ran the following commands:

./configure --prefix=<my_home_folder> --enable-fortran --enable-cxx
make
make check
make install

I made sure that echo $LD_LIBRARY_PATH includes <my_home_folder>/lib, so I was hoping R would be able to find all the .o and .so files it needs. As best I can tell by googling, there is something in a header file that is throwing off the C++ compiler's header checking, but I am probably way off. Do you have any suggestions for how to proceed? Thanks!

The exact error:
The full log:
My sessionInfo():