GridOPTICS / GridPACK

https://www.gridpack.org/

Unable to install GridPACK python wrapper #83

Closed abhyshr closed 11 months ago

abhyshr commented 3 years ago

I am getting an error installing the python wrapper on Constance. Any ideas what I am doing wrong here?

(base) python:feature/hadrec$ python setup.py build
running build
running build_ext
-- The C compiler identification is GNU 4.9.2
-- The CXX compiler identification is GNU 4.9.2
-- Check for working C compiler: /share/apps/gcc/4.9.2/bin/gcc
-- Check for working C compiler: /share/apps/gcc/4.9.2/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /share/apps/gcc/4.9.2/bin/g++
-- Check for working CXX compiler: /share/apps/gcc/4.9.2/bin/g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found PythonInterp: /share/apps/python/anaconda3.2019.3/bin/python (found version "3.7.3") 
-- Found PythonLibs: /share/apps/python/anaconda3.2019.3/lib/libpython3.7m.so
-- Performing Test HAS_CPP14_FLAG
-- Performing Test HAS_CPP14_FLAG - Success
-- pybind11 v2.4.3
statusGRIDPACK_HAVE_GOSS: OFF
statusGRIDPACK_GOSS_LIBRARY: 
-- Performing Test HAS_FLTO
-- Performing Test HAS_FLTO - Success
-- LTO enabled
-- Configuring done
-- Generating done
-- Build files have been written to: /qfs/people/abhy245/software/GridPACK/python/build/temp.linux-x86_64-3.7
Scanning dependencies of target parallel_scripts
[  0%] Built target parallel_scripts
Scanning dependencies of target gridpack
[ 50%] Building CXX object src/CMakeFiles/gridpack.dir/gridpack.cpp.o
[100%] Linking CXX shared module ../../lib.linux-x86_64-3.7/gridpack.cpython-37m-x86_64-linux-gnu.so
/share/apps/binutils/2.24/bin/ld: /people/abhy245/software/GridPACK/src/build_no_progress_ranks/lib/libgridpack_hadrec_module.a(hadrec_app_module.cpp.o): relocation R_X86_64_32 against `.rodata' can not be used when making a shared object; recompile with -fPIC
/people/abhy245/software/GridPACK/src/build_no_progress_ranks/lib/libgridpack_hadrec_module.a: error adding symbols: Bad value
collect2: error: ld returned 1 exit status
gmake[2]: *** [../lib.linux-x86_64-3.7/gridpack.cpython-37m-x86_64-linux-gnu.so] Error 1
gmake[1]: *** [src/CMakeFiles/gridpack.dir/all] Error 2
gmake: *** [all] Error 2
Traceback (most recent call last):
  File "setup.py", line 72, in <module>
    tests_require=['nose'],
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/site-packages/setuptools/__init__.py", line 145, in setup
    return distutils.core.setup(**attrs)
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/distutils/dist.py", line 966, in run_commands
    self.run_command(cmd)
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/distutils/dist.py", line 985, in run_command
    cmd_obj.run()
  File "setup.py", line 37, in run
    self.build_extension(ext)
  File "setup.py", line 56, in build_extension
    subprocess.check_call(['cmake', '--build', '.'] + build_args, cwd=self.build_temp)
  File "/share/apps/python/anaconda3.2019.3/lib/python3.7/subprocess.py", line 347, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '--build', '.', '--config', 'Release', '--', '-j2']' returned non-zero exit status 2.
abhyshr commented 3 years ago

I think it has something to do with building with shared libraries. How do I build GridPACK with shared libraries?

wperkins commented 3 years ago

@abhyshr To build the python wrapper you need to build GridPACK and all its dependencies as shared libraries. Your build is trying to link against the static library /people/abhy245/software/GridPACK/src/build_no_progress_ranks/lib/libgridpack_hadrec_module.a. You need to go back and rebuild GA with shared libraries.

wperkins commented 3 years ago

> I think it has to do something related to building with shared libraries. How do I build GridPACK with shared libraries?

-D BUILD_SHARED_LIBS=YES should be sufficient when building GridPACK.
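A hypothetical configure invocation showing where that flag goes; the dependency paths are placeholders, not the actual locations on this machine, and the variable names (`GA_DIR`, `BOOST_ROOT`, `PETSC_DIR`) are the ones GridPACK's CMake scripts conventionally take:

```shell
cmake -D BUILD_SHARED_LIBS:BOOL=YES \
      -D GA_DIR:STRING=/path/to/shared/ga \
      -D BOOST_ROOT:STRING=/path/to/shared/boost \
      -D PETSC_DIR:STRING=/path/to/shared/petsc \
      ..
```

Note that the flag only affects GridPACK itself; GA, Boost, and PETSc each have to be rebuilt as shared libraries with their own build systems.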

abhyshr commented 3 years ago

Ok. @bjpalmer: I am using PETSc and GridPACK that you've installed on pic. Do these build shared libraries? If not, can you update them to build shared libraries?

abhyshr commented 3 years ago

Oh boy... All dependencies (GA, PETSc, Boost) are built with static libraries. This is going to be a long one :-(

abhyshr commented 3 years ago

@wperkins: Do you have instructions on how to build GA and Boost as shared libraries? I know what to do with PETSc.

wperkins commented 3 years ago

@abhyshr I e-mailed my notes to you. They are old. No guarantees.

bjpalmer commented 3 years ago

I haven't done it in a while, but my recollection is that it goes through pretty smoothly. I will build shared object versions in the pic/projects area.

bjpalmer commented 3 years ago

Okay, I built shared object versions of petsc-3.8.4, boost-1.65.0 and ga-5.7 in /pic/projects/gridpack/software. Look for the -so suffixes on the boost and petsc directories and for the build_pr_so and build_ts_so directories under ga-5.7. I tried running the gridpack test suite with these libraries and most of the applications pass, but there are many more failures than I see with static libraries. The environment I used is

module purge
module load gcc/6.1.0
module load openmpi/3.0.1
module load python/2.7.3
module load cmake/3.17.1
module load git
module load java
setenv CC gcc
setenv CFLAGS "-pthread"
setenv CXX g++
setenv CXXFLAGS "-pthread"
setenv FC gfortran
setenv FFLAGS "-pthread"
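The setenv lines above are csh/tcsh syntax; for a bash or sh login the equivalents would be:

```shell
# bash equivalents of the csh setenv lines above
export CC=gcc
export CFLAGS="-pthread"
export CXX=g++
export CXXFLAGS="-pthread"
export FC=gfortran
export FFLAGS="-pthread"
```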
lzheng28 commented 3 years ago

I used these commands to install them, and can find the *.so files (their shared libs) in each install location:

Install shared lib: Boost (1.65.0):

cd boost_1_65_0
sh ./bootstrap.sh --prefix="/home/lei/software/boost_1_65_0" --without-icu --with-toolset=gcc --without-libraries=python,log

# Add the following to the end of project-config.jam:
# MPI
using mpi ;

./b2 -a -d+2 link=shared stage
sudo ./b2 -a -d+2 link=shared install

# Add the boost lib path
sudo vi /etc/ld.so.conf
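As an alternative to editing /etc/ld.so.conf (which requires root and a follow-up `ldconfig` run), the loader can be pointed at the Boost libraries per session. The path below is the install prefix assumed in the bootstrap command above:

```shell
# Register the Boost lib dir for this session only (no root needed).
export LD_LIBRARY_PATH=/home/lei/software/boost_1_65_0/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
```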

GA: (5.8)

git clone https://github.com/GlobalArrays/ga.git
cd ga
sudo ./autogen.sh
sudo ./configure --with-mpi-ts --enable-cxx --disable-f77 --without-blas --enable-i4 --enable-shared --prefix="/home/lei/software/ga"
sudo make
sudo make install

Petsc:

git clone -b release https://gitlab.com/petsc/petsc.git petsc-v3.8.4
cd petsc-v3.8.4
git checkout v3.8.4

./configure --with-mpi-dir=/usr/local --with-c++-support=1 --with-c-support=0 --with-fortran=0 --with-scalar-type=complex --download-superlu --download-superlu_dist --download-mumps --download-parmetis --download-metis --download-f2cblaslapack=1 --download-suitesparse --with-clanguage=c++ --with-shared-libraries=1 --with-x=0 --with-mpirun=mpirun --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack --with-cxx-dialect=C++11

make PETSC_DIR=/home/lei/Install_package/petsc-v3.8.4 PETSC_ARCH=arch-linux2-cxx-debug all
make PETSC_DIR=/home/lei/Install_package/petsc-v3.8.4 PETSC_ARCH=arch-linux2-cxx-debug test
make PETSC_DIR=/home/lei/Install_package/petsc-v3.8.4 PETSC_ARCH=arch-linux2-cxx-debug streams

export PETSC_DIR=/home/lei/Install_package/petsc-v3.8.4
export PETSC_ARCH=arch-linux2-cxx-debug

In build_config.sh, add

-D BUILD_SHARED_LIBS=YES
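Since the root cause of this whole thread was static-only dependency builds, a quick check along these lines can confirm a dependency tree is actually shared before configuring GridPACK against it. This is a sketch; `libcheck` is a made-up helper and the demo directory is throwaway:

```shell
# Report whether a lib directory contains shared objects.  Static-only
# builds (the situation hit above) leave only .a archives behind.
libcheck() {
  if ls "$1"/*.so* >/dev/null 2>&1; then
    echo shared
  else
    echo "static only"
  fi
}

# Demo with a throwaway directory; real use would be e.g.
#   libcheck "$PETSC_DIR/$PETSC_ARCH/lib"
mkdir -p /tmp/demo_lib && touch /tmp/demo_lib/libga.a
libcheck /tmp/demo_lib            # prints: static only
touch /tmp/demo_lib/libga.so.5
libcheck /tmp/demo_lib            # prints: shared
```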
bjpalmer commented 11 months ago

Looks like this issue is resolved. Closing it.