GridOPTICS / GridPACK

https://www.gridpack.org/

Error during "make" of GridPACK #81

Closed. lzheng28 closed this issue 9 months ago.

lzheng28 commented 3 years ago

Hello, I ran into an error while building GridPACK with "make". The system I use is Ubuntu 18.04. The following is the error output.

[ 66%] Linking CXX executable complex_linear_solver_test
cd /home/lei/Install_package/GridPACK/src/build/math && /usr/bin/cmake -E cmake_link_script CMakeFiles/real_dae_solver_test.dir/link.txt --verbose=1
/usr/bin/cmake: /usr/local/lib/libcurl.so.4: no version information available (required by /usr/bin/cmake)
/usr/bin/g++  -pthread  -rdynamic CMakeFiles/real_dae_solver_test.dir/test/dae_solver_test.cpp.o  -o real_dae_solver_test -Wl,-rpath,/usr/local/lib libgridpack_math.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libpetsc.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libcmumps.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libdmumps.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libsmumps.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libzmumps.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libmumps_common.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libpord.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libscalapack.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libumfpack.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libklu.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libcholmod.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libbtf.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libccolamd.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libcolamd.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libcamd.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libamd.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libsuitesparseconfig.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libsuperlu.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libsuperlu_dist.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libf2clapack.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libf2cblas.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libparmetis.a /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libmetis.a /usr/local/lib/libmpi_usempif08.so /usr/local/lib/libmpi_usempi_ignore_tkr.so /usr/local/lib/libmpi_mpifh.so /usr/local/lib/libmpi.so /usr/lib/gcc/x86_64-linux-gnu/7/libgfortran.so -lm -lgcc_s /usr/lib/gcc/x86_64-linux-gnu/7/libquadmath.so -lpthread /usr/lib/gcc/x86_64-linux-gnu/7/libstdc++.so /usr/lib/x86_64-linux-gnu/libdl.so ../configuration/libgridpack_configuration.a ../parallel/libgridpack_parallel.a ../timer/libgridpack_timer.a /home/lei/software/ga/lib/libga++.a /home/lei/software/ga/lib/libga.a /home/lei/software/ga/lib/libarmci.a /usr/lib/x86_64-linux-gnu/libboost_mpi.so /usr/lib/x86_64-linux-gnu/libboost_serialization.so /usr/lib/x86_64-linux-gnu/libboost_random.so /usr/lib/x86_64-linux-gnu/libboost_filesystem.so /usr/lib/x86_64-linux-gnu/libboost_system.so /usr/local/lib/libmpi.so /usr/lib/gcc/x86_64-linux-gnu/7/libgfortran.so -lm -lgcc_s /usr/lib/gcc/x86_64-linux-gnu/7/libquadmath.so -lpthread /usr/lib/gcc/x86_64-linux-gnu/7/libstdc++.so /usr/lib/x86_64-linux-gnu/libdl.so /home/lei/software/ga/lib/libga++.a /home/lei/software/ga/lib/libga.a /home/lei/software/ga/lib/libarmci.a /usr/lib/x86_64-linux-gnu/libboost_mpi.so /usr/lib/x86_64-linux-gnu/libboost_serialization.so /usr/lib/x86_64-linux-gnu/libboost_random.so /usr/lib/x86_64-linux-gnu/libboost_filesystem.so /usr/lib/x86_64-linux-gnu/libboost_system.so 
/usr/bin/ld: warning: libmpi.so.20, needed by /usr/lib/x86_64-linux-gnu/libboost_mpi.so, may conflict with libmpi.so.40
/usr/bin/ld: /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libzmumps.a(zmumps_f77.o)(.debug_info+0x3469): unresolvable R_X86_64_64 relocation against symbol `mpi_fortran_argv_null_'
/usr/bin/ld: final link failed: Nonrepresentable section on output
collect2: error: ld returned 1 exit status
math/CMakeFiles/real_dae_solver_test.dir/build.make:159: recipe for target 'math/real_dae_solver_test' failed
make[2]: *** [math/real_dae_solver_test] Error 1
make[2]: Leaving directory '/home/lei/Install_package/GridPACK/src/build'
CMakeFiles/Makefile2:748: recipe for target 'math/CMakeFiles/real_dae_solver_test.dir/all' failed
make[1]: *** [math/CMakeFiles/real_dae_solver_test.dir/all] Error 2
make[1]: Leaving directory '/home/lei/Install_package/GridPACK/src/build'
Makefile:165: recipe for target 'all' failed
make: *** [all] Error 2

And my CMake configure script (.sh) is

# Remove any cached CMake state from previous configure attempts
rm -rf CMake*

# Use the GNU toolchain and build everything with -pthread
CC=gcc
CXX=g++
CFLAGS="-pthread"
CXXFLAGS="-pthread"
FCFLAGS="-pthread"
F77=gfortran
F77FLAGS="-pthread"
export CC CXX CFLAGS CXXFLAGS FCFLAGS F77 F77FLAGS

cmake \
    -D PETSC_ARCH:STRING="arch-linux-cxx-debug" \
    -D PETSC_DIR:STRING="/home/lei/Install_package/petsc-v3.11.2" \
    -D PARMETIS_DIR:PATH="/usr" \
    -D MPI_CXX_COMPILER:STRING="mpicxx" \
    -D GA_DIR:STRING='/home/lei/software/ga' \
    -D MPI_C_COMPILER:STRING="mpicc" \
    -D MPIEXEC:STRING="mpiexec" \
    -D MPIEXEC_MAX_NUMPROCS:STRING="2" \
    -D GRIDPACK_TEST_TIMEOUT:STRING=60 \
    -D USE_GLPK:BOOL=ON \
    -D GLPK_ROOT_DIR:PATH="/usr" \
    -D BUILD_SHARED_LIBS:BOOL=OFF \
    -D CMAKE_INSTALL_PREFIX:PATH="$HOME/gridpack" \
    -D CMAKE_VERBOSE_MAKEFILE:BOOL=TRUE \
    ..

After executing it, the output is

lei@lei-VirtualBox:~/Install_package/GridPACK/src/build$ sudo ./build_config3.sh
cmake: /usr/local/lib/libcurl.so.4: no version information available (required by cmake)
CMake Deprecation Warning at CMakeLists.txt:21 (cmake_policy):
  The OLD behavior for policy CMP0026 will be removed from a future version
  of CMake.

  The cmake-policies(7) manual explains that the OLD behaviors of all
  policies are deprecated and that a policy should be set to OLD only under
  specific short-term circumstances.  Projects should be ported to the NEW
  behavior and not rely on setting a policy to OLD.

-- The C compiler identification is GNU 7.5.0
-- The CXX compiler identification is GNU 7.5.0
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/g++
-- Check for working CXX compiler: /usr/bin/g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Checking MPI ...
MPI path
-- Found MPI_C: /usr/local/lib/libmpi.so (found version "3.1") 
-- Found MPI_CXX: /usr/local/lib/libmpi.so (found version "3.1") 
-- Found MPI: TRUE (found version "3.1")  
-- MPI_CXX_LIBRARIES: 
-- Checking Boost ...
-- Boost version: 1.65.1
-- Found the following Boost libraries:
--   mpi
--   serialization
--   random
--   filesystem
--   system
-- Checking PETSc ...
-- petsc_lib_dir /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib
-- Recognized PETSc install with single library for all packages
-- Performing Test MULTIPASS_TEST_1_petsc_works_minimal
-- Performing Test MULTIPASS_TEST_1_petsc_works_minimal - Failed
-- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes
-- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes - Failed
-- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries
-- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries - Success
-- PETSc only need minimal includes, but requires explicit linking to all dependencies.  This is expected when PETSc is built with static libraries.
-- Found PETSc: /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/include;/home/lei/Install_package/petsc-v3.11.2/include;/usr/local/include (found version "3.11.2") 
-- Using PETSc version 3.11.2
-- PETSc installation is double precision (--with-precision=double) -- good
-- PETSc installation uses complex type (--with-scalar-type=complex)
-- PETSc installation uses C++ (--with-clanguage=c++) -- we can work with that.
-- PETSc parallel LU linear solver will be from SuperLU_dist
-- Checking ParMETIS ...
-- Performing Test PARMETIS_TEST_RUNS
-- Performing Test PARMETIS_TEST_RUNS - Success
-- Found ParMETIS: /home/lei/Install_package/petsc-v3.11.2/arch-linux-cxx-debug/lib/libparmetis.a (found version "4.0") 
-- Checking GA ...
-- GA_INCLUDE_DIR: /home/lei/software/ga/include
-- GA_LIBRARY: /home/lei/software/ga/lib/libga.a
-- GA_CXX_LIBRARY: /home/lei/software/ga/lib/libga++.a
-- ARMCI_LIBRARY: /home/lei/software/ga/lib/libarmci.a
-- Performing Test GA_TEST_RUNS
-- Performing Test GA_TEST_RUNS - Success
-- Found GA: /home/lei/software/ga/lib/libga.a  
-- Checking glpk ...
-- Found GLPK: /usr/local/lib/libglpk.so  
-- Found Doxygen: /usr/local/bin/doxygen (found version "1.9.2 (cd998a7164e30cee896cccd190846b79ebb4355f)") found components:  doxygen dot 
-- Configuring done
-- Generating done
-- Build files have been written to: /home/lei/Install_package/GridPACK/src/build

I think this means each package was found successfully. Could you please help me solve this problem? Is there a dependency I didn't install, or a version that doesn't fit GridPACK? Thank you so much.

I also noticed something strange: I installed Open MPI v4.1.0, but the output reports the MPI version as 3.1.

wperkins commented 3 years ago

You appear to be using a non-system MPI (/usr/local/lib/libmpi.so), but using the system Boost. If you want to keep using your custom MPI, you will need to build Boost from source. Use the CentOS 6 instructions as a guide. Then, specify -D BOOST_ROOT when configuring GridPACK.
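
For example, a minimal sketch of building Boost from source against a custom MPI (the 1.70.0 version and the paths below are assumptions; the CentOS 6 instructions are the authoritative reference):

# Sketch: build Boost, including Boost.MPI, against a non-system MPI.
cd boost_1_70_0
./bootstrap.sh --prefix="$HOME/software/boost" \
    --with-libraries=mpi,serialization,random,filesystem,system
# Point Boost.Build at the custom MPI compiler wrapper.
echo "using mpi : /usr/local/bin/mpicxx ;" >> project-config.jam
./b2 install
# Then add to the GridPACK configure line:
#   -D BOOST_ROOT:PATH="$HOME/software/boost"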

bjpalmer commented 3 years ago

I think you may also run into problems using Open MPI 4.1.0. GridPACK has problems building with versions of PETSc greater than 3.10 that we haven't had an opportunity to fix, but lower versions of PETSc don't compile with Open MPI 4.1.0 due to some functions that are completely deprecated in Open MPI after 4.0. Can you use an earlier version of Open MPI?
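
For reference, a minimal sketch of building an earlier Open MPI from source (the 2.1.6 version and the /usr/local prefix below are assumptions; pick whichever earlier release works for you):

# Sketch: install an older Open MPI release from source.
wget https://download.open-mpi.org/release/open-mpi/v2.1/openmpi-2.1.6.tar.gz
tar xzf openmpi-2.1.6.tar.gz
cd openmpi-2.1.6
./configure --prefix=/usr/local
make -j 4
sudo make install
sudo ldconfig    # refresh the shared-library cache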

lzheng28 commented 3 years ago

> You appear to be using a non-system MPI (/usr/local/lib/libmpi.so), but using the system Boost. If you want to keep using your custom MPI, you will need to build Boost from source. Use the CentOS 6 instructions as a guide. Then, specify -D BOOST_ROOT when configuring GridPACK.

Thank you so much. I installed Open MPI 2.1.6, GA 5.6.1/5.7/5.8, Boost 1.70.0 (using the link you provided), and PETSc 3.7.7, and "make -j 4" completes successfully. But when I run "make test", two items fail: 55 - petsc_ga_matrix_parallel (Failed) and 76 - real_time_path_rating_serial (Timeout). Do I need to install different versions of them? Is there any recommendation? Thank you so much.

The following is the output of the failing tests (see also https://github.com/GridOPTICS/GridPACK/issues/37):

lei@lei-VirtualBox:~/Install_package/GridPACK_without_h/src/build$ ctest -VV -I 55,55
UpdateCTestConfiguration  from :/home/lei/Install_package/GridPACK_without_h/src/build/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/lei/Install_package/GridPACK_without_h/src/build/DartConfiguration.tcl
Test project /home/lei/Install_package/GridPACK_without_h/src/build
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 55
    Start 55: petsc_ga_matrix_parallel

55: Test command: /usr/local/bin/mpiexec "-n" "2" "/home/lei/Install_package/GridPACK_without_h/src/build/math/petsc_ga_matrix_test"
55: Test timeout computed to be: 60
55: 
55: GridPACK math module configured on 2 processors
55: Running 6 test cases...
55: Running 6 test cases...
55: [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
55: [0]PETSC ERROR: Nonconforming object sizes
55: [0]PETSC ERROR: Sum of local lengths 20 does not equal global length 10, my local length 10
55:   likely a call to VecSetSizes() or MatSetSizes() is wrong.
55: See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
55: [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
55: [0]PETSC ERROR: Petsc Release Version 3.7.7, unknown 
55: [0]PETSC ERROR: /home/lei/Install_package/GridPACK_without_h/src/build/math/petsc_ga_matrix_test on a arch-linux2-cxx-debug named lei-VirtualBox by lei Sun May 16 17:51:29 2021
55: [0]PETSC ERROR: Configure options --with-mpi-dir=/usr/local --with-c++-support=1 --with-c-support=0 --with-fortran=0 --with-scalar-type=complex --download-superlu --download-superlu_dist --download-mumps --download-parmetis --download-metis --download-f2cblaslapack=1 --download-suitesparse --with-clanguage=c++ --with-shared-libraries=0 --with-x=0 --with-mpirun=mpirun --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack --with-cxx-dialect=C++11
55: [0]PETSC ERROR: #1 PetscSplitOwnership() line 93 in /home/lei/Install_package/petsc-v3.7.7/src/sys/utils/psplit.c
55: [0]PETSC ERROR: #2 PetscLayoutSetUp() line 143 in /home/lei/Install_package/petsc-v3.7.7/src/vec/is/utils/pmap.c
55: [0]PETSC ERROR: #3 MatMPIDenseSetPreallocation_MPIDense() line 1275 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [0]PETSC ERROR: #4 MatMPIDenseSetPreallocation() line 1419 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [0]PETSC ERROR: #5 MatSetUp_MPIDense() line 999 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [0]PETSC ERROR: #6 MatSetUp() line 739 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [0]PETSC ERROR: #7 MatConvert_Shell() line 31 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/shell/shellcnv.c
55: [0]PETSC ERROR: #8 MatConvert() line 3983 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [0]PETSC ERROR: #9 test_method() line 129 in /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp
55: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
55: [1]PETSC ERROR: Nonconforming object sizes
55: [1]PETSC ERROR: Sum of local lengths 20 does not equal global length 10, my local length 10
55:   likely a call to VecSetSizes() or MatSetSizes() is wrong.
55: See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
55: [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
55: [1]PETSC ERROR: Petsc Release Version 3.7.7, unknown 
55: [1]PETSC ERROR: /home/lei/Install_package/GridPACK_without_h/src/build/math/petsc_ga_matrix_test on a arch-linux2-cxx-debug named lei-VirtualBox by lei Sun May 16 17:51:29 2021
55: [1]PETSC ERROR: Configure options --with-mpi-dir=/usr/local --with-c++-support=1 --with-c-support=0 --with-fortran=0 --with-scalar-type=complex --download-superlu --download-superlu_dist --download-mumps --download-parmetis --download-metis --download-f2cblaslapack=1 --download-suitesparse --with-clanguage=c++ --with-shared-libraries=0 --with-x=0 --with-mpirun=mpirun --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack --with-cxx-dialect=C++11
55: [1]PETSC ERROR: #1 PetscSplitOwnership() line 93 in /home/lei/Install_package/petsc-v3.7.7/src/sys/utils/psplit.c
55: [1]PETSC ERROR: #2 PetscLayoutSetUp() line 143 in /home/lei/Install_package/petsc-v3.7.7/src/vec/is/utils/pmap.c
55: [1]PETSC ERROR: #3 MatMPIDenseSetPreallocation_MPIDense() line 1275 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [1]PETSC ERROR: #4 MatMPIDenseSetPreallocation() line 1419 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [1]PETSC ERROR: #5 MatSetUp_MPIDense() line 999 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [1]PETSC ERROR: #6 MatSetUp() line 739 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [1]PETSC ERROR: #7 MatConvert_Shell() line 31 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/shell/shellcnv.c
55: [1]PETSC ERROR: #8 MatConvert() line 3983 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [1]PETSC ERROR: #9 test_method() line 129 in /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp
55: unknown location(0): fatal error: in "GAMatrixTest/ConstructConvertDuplicate": std::runtime_error: Error detected in C PETSc
55: /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp(122): last checkpoint
55: unknown location(0): fatal error: in "GAMatrixTest/ConstructConvertDuplicate": std::runtime_error: Error detected in C PETSc
55: /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp(122): last checkpoint
55: 
55: *** 1 failure is detected in the test module "Master Test Suite"
55: 
55: *** 1 failure is detected in the test module "Master Test Suite"
55: failure detected
55: -------------------------------------------------------
55: Primary job  terminated normally, but 1 process returned
55: a non-zero exit code.. Per user-direction, the job has been aborted.
55: -------------------------------------------------------
55: --------------------------------------------------------------------------
55: mpiexec detected that one or more processes exited with non-zero status, thus causing
55: the job to be terminated. The first process to do so was:
55: 
55:   Process name: [[20759,1],0]
55:   Exit code:    2
55: --------------------------------------------------------------------------
1/1 Test #55: petsc_ga_matrix_parallel .........***Failed  Error regular expression found in output. Regex=[failure detected]  1.10 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   1.11 sec

The following tests FAILED:
     55 - petsc_ga_matrix_parallel (Failed)
Errors while running CTest
......
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
1/1 Test #76: real_time_path_rating_serial .....***Timeout  60.01 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =  60.10 sec

The following tests FAILED:
     76 - real_time_path_rating_serial (Timeout)
Errors while running CTest

lzheng28 commented 3 years ago

> I think you may also run into problems using Open MPI 4.1.0. GridPACK has problems building with versions of PETSc greater than 3.10 that we haven't had an opportunity to fix, but lower versions of PETSc don't compile with Open MPI 4.1.0 due to some functions that are completely deprecated in Open MPI after 4.0. Can you use an earlier version of Open MPI?

I changed the versions from Open MPI 4.1.0 and PETSc 3.11.2 to Open MPI 1.8.1 and PETSc 3.9, but there is another problem, which I think comes from PETSc. The following is the error:

In file included from /home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver.cpp:21:0:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:101:10: error: ‘MatSolverPackage’ does not name a type; did you mean ‘MatSolverType’?
   static MatSolverPackage p_supportedSolverPackage[];
          ^~~~~~~~~~~~~~~~
          MatSolverType
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:107:3: error: ‘MatSolverPackage’ does not name a type; did you mean ‘MatSolverType’?
   MatSolverPackage p_solverPackage;
   ^~~~~~~~~~~~~~~~
   MatSolverType
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: In constructor ‘gridpack::math::PetscLinearMatrixSolverImplementation<T, I>::PetscLinearMatrixSolverImplementation(const MatrixType&)’:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:56:7: error: class ‘gridpack::math::PetscLinearMatrixSolverImplementation<T, I>’ does not have any field named ‘p_solverPackage’
       p_solverPackage(MATSOLVERSUPERLU_DIST),
       ^~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: In member function ‘void gridpack::math::PetscLinearMatrixSolverImplementation<T, I>::p_configure(gridpack::utility::Configuration::CursorPtr)’:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:170:21: error: ‘p_supportedSolverPackage’ was not declared in this scope
         if (mstr == p_supportedSolverPackage[i]) {
                     ^~~~~~~~~~~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:170:21: note: suggested alternative: ‘p_nSupportedSolverPackages’
         if (mstr == p_supportedSolverPackage[i]) {
                     ^~~~~~~~~~~~~~~~~~~~~~~~
                     p_nSupportedSolverPackages
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:171:11: error: ‘p_solverPackage’ was not declared in this scope
           p_solverPackage = p_supportedSolverPackage[i];
           ^~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: In member function ‘void gridpack::math::PetscLinearMatrixSolverImplementation<T, I>::p_factor() const’:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:227:31: error: ‘p_solverPackage’ was not declared in this scope
       ierr = MatGetFactor(*A, p_solverPackage, p_factorType, &p_Fmat);CHKERRXX(ierr);
                               ^~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: At global scope:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:300:1: error: ‘MatSolverPackage’ does not name a type; did you mean ‘MatSolverType’?
 MatSolverPackage
 ^~~~~~~~~~~~~~~~
 MatSolverType
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:306:2: warning: extra ‘;’ [-Wpedantic]
 };
  ^
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:10: error: ‘p_supportedSolverPackage’ was not declared in this scope
   sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
          ^~~~~~~~~~~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:10: note: suggested alternative: ‘p_nSupportedSolverPackages’
   sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
          ^~~~~~~~~~~~~~~~~~~~~~~~
          p_nSupportedSolverPackages
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:43: error: ‘MatSolverPackage’ was not declared in this scope
   sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
                                           ^~~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:43: note: suggested alternative: ‘MatSolverType’
   sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
                                           ^~~~~~~~~~~~~~~~
                                           MatSolverType
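
These errors come from an API rename: PETSc 3.9 replaced MatSolverPackage with MatSolverType (as the compiler's "did you mean" hints show), and the GridPACK sources at the time still used the old name, so any PETSc release at or above 3.9 fails this way. Staying below 3.9 avoids the rename. A minimal configure sketch for PETSc 3.7.7, reusing the options reported in the petsc_ga_matrix_test log above (the /usr/local MPI prefix is an assumption; adjust it to your installation):

# Sketch: configure and build PETSc 3.7.7 with the options from the test log.
cd petsc-v3.7.7
./configure --with-mpi-dir=/usr/local \
    --with-c++-support=1 --with-c-support=0 --with-fortran=0 \
    --with-scalar-type=complex --with-clanguage=c++ \
    --with-cxx-dialect=C++11 --with-shared-libraries=0 \
    --with-x=0 --with-debugging=1 \
    --download-superlu --download-superlu_dist --download-mumps \
    --download-parmetis --download-metis --download-scalapack \
    --download-suitesparse --download-f2cblaslapack=1
make all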

I finally chose Open MPI 2.1.6, Boost 1.70.0, GA 5.6.1/5.7/5.8, and PETSc 3.7.7. This passes "make -j 4" but fails two items in "make test" (55 - petsc_ga_matrix_parallel (Failed), 76 - real_time_path_rating_serial (Timeout)), as posted in my last reply. I think it's a problem with GA and PETSc; do you have any recommendations on their versions? Thank you so much.

wperkins commented 9 months ago

Looks like this is resolved.