Parallel-NetCDF / PnetCDF

Source code repository of PnetCDF library and utilities
https://parallel-netcdf.github.io

skipping incompatible /usr/lib libraries #101

Open cponder opened 1 year ago

cponder commented 1 year ago

I'm building on a Cray system running SLES 15 SP4. The build gives me a lot of these messages:

/usr/bin/ld: skipping incompatible /usr/lib/librt.so when searching for -lrt
/usr/bin/ld: skipping incompatible /usr/lib/libutil.so when searching for -lutil
/usr/bin/ld: skipping incompatible /usr/lib/libdl.so when searching for -ldl
/usr/bin/ld: skipping incompatible /usr/lib/libpthread.so when searching for -lpthread
/usr/bin/ld: skipping incompatible /usr/lib/libc.so when searching for -lc
/usr/bin/ld: skipping incompatible /usr/lib/libm.so when searching for -lm

I know these are just warnings, but my builds under Ubuntu never produced them.

cponder commented 1 year ago

I believe the problem is that the linker searches /usr/lib before /usr/lib64. I don't see where that ordering comes from in the build information, though:

CFLAGS   = -fPIC -m64 -tp=px
CXXFLAGS = -fPIC -m64 -tp=px
FFLAGS   = -I/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/include -I/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/include/CL -I/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/extras/CUPTI/include -I/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/extras/Debugger/include -I/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/nvvm/include -I/global/common/software/nvendor/perlmutter/SHARE/Utils/HWLoc/2.8.0/PGI-23.1_CUDA-12.0.1.0_525.85.12/include -I/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_UCX-1.13.1_HWLoc-2.8.0_ZLib-1.2.13/include  -I/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_UCX-1.13.1_HWLoc-2.8.0_ZLib-1.2.13/lib
FCFLAGS  = -fPIC -m64 -tp=px
LDFLAGS  = -L/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/lib64 -L/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/driver -L/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/extras/CUPTI/lib64 -L/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/extras/Debugger/lib64 -L/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/nvvm/lib64 -L/global/common/software/nvendor/perlmutter/SHARE/Utils/HWLoc/2.8.0/PGI-23.1_CUDA-12.0.1.0_525.85.12/lib -L/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_UCX-1.13.1_HWLoc-2.8.0_ZLib-1.2.13/lib -L/global/common/software/nvendor/perlmutter/SHARE/Utils/PGI/23.1/Linux_x86_64/23.1/cuda/lib64 -lnvidia-ml
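The -L flags in LDFLAGS only add directories; the relative order of the system defaults (/usr/lib vs /usr/lib64) comes from the linker's built-in search path, which never appears in configure output. With GNU ld it can be inspected directly (the grep pattern is specific to GNU ld's --verbose output):

```shell
# Print GNU ld's built-in library search directories, in the order
# they are tried. On a multilib system, /usr/lib appearing before
# /usr/lib64 in this list would explain the warnings.
ld --verbose | grep SEARCH_DIR | tr -s ' ;' '\n'
```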
cponder commented 1 year ago

Likewise with the MPI compiles:

--> mpicc --show | xargs explode ' ' 
pgcc
-I/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/include
-L/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/lib
-L/global/common/software/nvendor/perlmutter/SHARE/Utils/HWLoc/2.8.0/PGI-23.1_CUDA-12.0.1.0_525.85.12/lib
-Wl,-rpath
-Wl,/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/lib
-Wl,-rpath
-Wl,/global/common/software/nvendor/perlmutter/SHARE/Utils/HWLoc/2.8.0/PGI-23.1_CUDA-12.0.1.0_525.85.12/lib
-Wl,--enable-new-dtags
-lmpi

--> mpifort --show | xargs explode ' ' 
pgfortran
-I/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/include
-I/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/lib
-L/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/lib
-L/global/common/software/nvendor/perlmutter/SHARE/Utils/HWLoc/2.8.0/PGI-23.1_CUDA-12.0.1.0_525.85.12/lib
-Wl,-rpath
-Wl,/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/lib
-Wl,-rpath
-Wl,/global/common/software/nvendor/perlmutter/SHARE/Utils/HWLoc/2.8.0/PGI-23.1_CUDA-12.0.1.0_525.85.12/lib
-Wl,--enable-new-dtags
-lmpi_usempif08
-lmpi_usempi_ignore_tkr
-lmpi_mpifh
-lmpi
cponder commented 1 year ago

And the paths:

--> explode : $LD_LIBRARY_PATH
/global/common/software/nvendor/perlmutter/SHARE/Utils/OpenMPI/4.1.5/PGI-23.1_CUDA-12.0.1.0_525.85.12_LIBFABRIC-1.15.2.0_HWLoc-2.8.0_ZLib-1.2.13/lib
/opt/cray/libfabric/1.15.2.0/lib64
/global/common/software/nvendor/perlmutter/SHARE/Utils/HWLoc/2.8.0/PGI-23.1_CUDA-12.0.1.0_525.85.12/lib
/global/common/software/nvendor/perlmutter/SHARE/Utils/PGI/23.1/Linux_x86_64/23.1/compilers/lib
/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/nvvm/lib64
/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/extras/Debugger/lib64
/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/extras/CUPTI/lib64
/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/lib64
/lib64
/usr/lib64
/lib
/usr/lib
/usr/local/lib
/usr/libexec
/usr/local/libexec
/global/homes/c/cponder/PerlMutter/lib
/global/common/software/nvendor/perlmutter/SHARE/Utils/CUDA/12.0.1.0_525.85.12/driver
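Worth noting: LD_LIBRARY_PATH primarily controls the runtime loader; at link time, GCC-compatible drivers consult LIBRARY_PATH instead (GNU ld only falls back to LD_LIBRARY_PATH on a native linker when resolving shared-library dependencies). If the built-in ordering is the culprit, a hypothetical workaround is to put the 64-bit directories first in the link-time path:

```shell
# Hypothetical workaround: have the compiler driver search the 64-bit
# system directories before /usr/lib at link time. This changes where
# -l libraries are looked up; it does not edit ld's built-in list.
export LIBRARY_PATH=/usr/lib64:/lib64${LIBRARY_PATH:+:$LIBRARY_PATH}
printf '%s\n' "$LIBRARY_PATH"
```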
wkliao commented 1 year ago

It appears to me you are running on Perlmutter at NERSC. Could you please provide information about the loaded modules (i.e., the output of the command "module list") and the configure command you used?