Closed: lzheng28 closed this issue 9 months ago.
You appear to be using a non-system MPI (/usr/local/lib/libmpi.so) together with the system Boost. If you want to keep using your custom MPI, you will need to build Boost from source; use the CentOS 6 instructions as a guide. Then specify -D BOOST_ROOT when configuring GridPACK.
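A configure invocation along these lines would point GridPACK at a self-built Boost (the paths below are placeholders, not taken from this thread; adjust them to the actual install prefixes):

```shell
# Run from the GridPACK build directory; /opt/boost-1.70.0 is a
# placeholder for wherever the self-built Boost was installed.
cmake -D BOOST_ROOT:PATH=/opt/boost-1.70.0 \
      -D MPI_CXX_COMPILER:STRING=mpicxx \
      ..
```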
I think you may also run into problems using Open MPI 4.1.0. GridPACK has problems building with versions of PETSc greater than 3.10 that we haven't had an opportunity to fix, but lower versions of PETSc don't compile with Open MPI 4.1.0 due to some functions that are completely deprecated in Open MPI after 4.0. Can you use an earlier version of Open MPI?
Thank you so much. I installed openmpi 2.1.6, ga 5.6.1/ga 5.7/ga 5.8, boost 1.70.0 (built using the link you provided), and petsc 3.7.7, and "make -j 4" now completes successfully. However, when I execute "make test", two tests fail (55 - petsc_ga_matrix_parallel (Failed), 76 - real_time_path_rating_serial (Timeout)). Do I need to install different versions of these packages? Is there any recommendation? Thank you so much.
The following is the output of the failing tests (refer to https://github.com/GridOPTICS/GridPACK/issues/37):
lei@lei-VirtualBox:~/Install_package/GridPACK_without_h/src/build$ ctest -VV -I 55,55
UpdateCTestConfiguration from :/home/lei/Install_package/GridPACK_without_h/src/build/DartConfiguration.tcl
UpdateCTestConfiguration from :/home/lei/Install_package/GridPACK_without_h/src/build/DartConfiguration.tcl
Test project /home/lei/Install_package/GridPACK_without_h/src/build
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 55
Start 55: petsc_ga_matrix_parallel
55: Test command: /usr/local/bin/mpiexec "-n" "2" "/home/lei/Install_package/GridPACK_without_h/src/build/math/petsc_ga_matrix_test"
55: Test timeout computed to be: 60
55:
55: GridPACK math module configured on 2 processors
55: Running 6 test cases...
55: Running 6 test cases...
55: [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
55: [0]PETSC ERROR: Nonconforming object sizes
55: [0]PETSC ERROR: Sum of local lengths 20 does not equal global length 10, my local length 10
55: likely a call to VecSetSizes() or MatSetSizes() is wrong.
55: See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
55: [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
55: [0]PETSC ERROR: Petsc Release Version 3.7.7, unknown
55: [0]PETSC ERROR: /home/lei/Install_package/GridPACK_without_h/src/build/math/petsc_ga_matrix_test on a arch-linux2-cxx-debug named lei-VirtualBox by lei Sun May 16 17:51:29 2021
55: [0]PETSC ERROR: Configure options --with-mpi-dir=/usr/local --with-c++-support=1 --with-c-support=0 --with-fortran=0 --with-scalar-type=complex --download-superlu --download-superlu_dist --download-mumps --download-parmetis --download-metis --download-f2cblaslapack=1 --download-suitesparse --with-clanguage=c++ --with-shared-libraries=0 --with-x=0 --with-mpirun=mpirun --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack --with-cxx-dialect=C++11
55: [0]PETSC ERROR: #1 PetscSplitOwnership() line 93 in /home/lei/Install_package/petsc-v3.7.7/src/sys/utils/psplit.c
55: [0]PETSC ERROR: #2 PetscLayoutSetUp() line 143 in /home/lei/Install_package/petsc-v3.7.7/src/vec/is/utils/pmap.c
55: [0]PETSC ERROR: #3 MatMPIDenseSetPreallocation_MPIDense() line 1275 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [0]PETSC ERROR: #4 MatMPIDenseSetPreallocation() line 1419 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [0]PETSC ERROR: #5 MatSetUp_MPIDense() line 999 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [0]PETSC ERROR: #6 MatSetUp() line 739 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [0]PETSC ERROR: #7 MatConvert_Shell() line 31 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/shell/shellcnv.c
55: [0]PETSC ERROR: #8 MatConvert() line 3983 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [0]PETSC ERROR: #9 test_method() line 129 in /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp
55: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
55: [1]PETSC ERROR: Nonconforming object sizes
55: [1]PETSC ERROR: Sum of local lengths 20 does not equal global length 10, my local length 10
55: likely a call to VecSetSizes() or MatSetSizes() is wrong.
55: See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
55: [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
55: [1]PETSC ERROR: Petsc Release Version 3.7.7, unknown
55: [1]PETSC ERROR: /home/lei/Install_package/GridPACK_without_h/src/build/math/petsc_ga_matrix_test on a arch-linux2-cxx-debug named lei-VirtualBox by lei Sun May 16 17:51:29 2021
55: [1]PETSC ERROR: Configure options --with-mpi-dir=/usr/local --with-c++-support=1 --with-c-support=0 --with-fortran=0 --with-scalar-type=complex --download-superlu --download-superlu_dist --download-mumps --download-parmetis --download-metis --download-f2cblaslapack=1 --download-suitesparse --with-clanguage=c++ --with-shared-libraries=0 --with-x=0 --with-mpirun=mpirun --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack --with-cxx-dialect=C++11
55: [1]PETSC ERROR: #1 PetscSplitOwnership() line 93 in /home/lei/Install_package/petsc-v3.7.7/src/sys/utils/psplit.c
55: [1]PETSC ERROR: #2 PetscLayoutSetUp() line 143 in /home/lei/Install_package/petsc-v3.7.7/src/vec/is/utils/pmap.c
55: [1]PETSC ERROR: #3 MatMPIDenseSetPreallocation_MPIDense() line 1275 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [1]PETSC ERROR: #4 MatMPIDenseSetPreallocation() line 1419 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [1]PETSC ERROR: #5 MatSetUp_MPIDense() line 999 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/dense/mpi/mpidense.c
55: [1]PETSC ERROR: #6 MatSetUp() line 739 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [1]PETSC ERROR: #7 MatConvert_Shell() line 31 in /home/lei/Install_package/petsc-v3.7.7/src/mat/impls/shell/shellcnv.c
55: [1]PETSC ERROR: #8 MatConvert() line 3983 in /home/lei/Install_package/petsc-v3.7.7/src/mat/interface/matrix.c
55: [1]PETSC ERROR: #9 test_method() line 129 in /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp
55: unknown location(0): fatal error: in "GAMatrixTest/ConstructConvertDuplicate": std::runtime_error: Error detected in C PETSc
55: /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp(122): last checkpoint
55: unknown location(0): fatal error: in "GAMatrixTest/ConstructConvertDuplicate": std::runtime_error: Error detected in C PETSc
55: /home/lei/Install_package/GridPACK_without_h/src/math/test/petsc_ga_matrix.cpp(122): last checkpoint
55:
55: *** 1 failure is detected in the test module "Master Test Suite"
55:
55: *** 1 failure is detected in the test module "Master Test Suite"
55: failure detected
55: -------------------------------------------------------
55: Primary job terminated normally, but 1 process returned
55: a non-zero exit code.. Per user-direction, the job has been aborted.
55: -------------------------------------------------------
55: --------------------------------------------------------------------------
55: mpiexec detected that one or more processes exited with non-zero status, thus causing
55: the job to be terminated. The first process to do so was:
55:
55: Process name: [[20759,1],0]
55: Exit code: 2
55: --------------------------------------------------------------------------
1/1 Test #55: petsc_ga_matrix_parallel .........***Failed Error regular expression found in output. Regex=[failure detected] 1.10 sec
0% tests passed, 1 tests failed out of 1
Total Test time (real) = 1.11 sec
The following tests FAILED:
55 - petsc_ga_matrix_parallel (Failed)
Errors while running CTest
......
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
76: 0: PETSc KSP converged after 1 iterations, reason: 4
1/1 Test #76: real_time_path_rating_serial .....***Timeout 60.01 sec
0% tests passed, 1 tests failed out of 1
Total Test time (real) = 60.10 sec
The following tests FAILED:
76 - real_time_path_rating_serial (Timeout)
Errors while running CTest
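The 60-second limit that killed test 76 is ctest's configured per-test timeout, which a slow VirtualBox guest can easily exceed; the timed-out case can be rerun with a longer limit to see whether it actually fails or is merely slow (command sketch, run from the build directory):

```shell
# Rerun only test 76 with a 300 s limit instead of the default 60 s.
ctest --timeout 300 -VV -I 76,76
```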
I changed from openmpi 4.1.0 and petsc 3.11.2 to openmpi 1.8.1 and petsc 3.9, but now there is another problem, which I believe comes from petsc. The following is the error output:
In file included from /home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver.cpp:21:0:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:101:10: error: ‘MatSolverPackage’ does not name a type; did you mean ‘MatSolverType’?
static MatSolverPackage p_supportedSolverPackage[];
^~~~~~~~~~~~~~~~
MatSolverType
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:107:3: error: ‘MatSolverPackage’ does not name a type; did you mean ‘MatSolverType’?
MatSolverPackage p_solverPackage;
^~~~~~~~~~~~~~~~
MatSolverType
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: In constructor ‘gridpack::math::PetscLinearMatrixSolverImplementation<T, I>::PetscLinearMatrixSolverImplementation(const MatrixType&)’:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:56:7: error: class ‘gridpack::math::PetscLinearMatrixSolverImplementation<T, I>’ does not have any field named ‘p_solverPackage’
p_solverPackage(MATSOLVERSUPERLU_DIST),
^~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: In member function ‘void gridpack::math::PetscLinearMatrixSolverImplementation<T, I>::p_configure(gridpack::utility::Configuration::CursorPtr)’:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:170:21: error: ‘p_supportedSolverPackage’ was not declared in this scope
if (mstr == p_supportedSolverPackage[i]) {
^~~~~~~~~~~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:170:21: note: suggested alternative: ‘p_nSupportedSolverPackages’
if (mstr == p_supportedSolverPackage[i]) {
^~~~~~~~~~~~~~~~~~~~~~~~
p_nSupportedSolverPackages
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:171:11: error: ‘p_solverPackage’ was not declared in this scope
p_solverPackage = p_supportedSolverPackage[i];
^~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: In member function ‘void gridpack::math::PetscLinearMatrixSolverImplementation<T, I>::p_factor() const’:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:227:31: error: ‘p_solverPackage’ was not declared in this scope
ierr = MatGetFactor(*A, p_solverPackage, p_factorType, &p_Fmat);CHKERRXX(ierr);
^~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp: At global scope:
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:300:1: error: ‘MatSolverPackage’ does not name a type; did you mean ‘MatSolverType’?
MatSolverPackage
^~~~~~~~~~~~~~~~
MatSolverType
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:306:2: warning: extra ‘;’ [-Wpedantic]
};
^
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:10: error: ‘p_supportedSolverPackage’ was not declared in this scope
sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
^~~~~~~~~~~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:10: note: suggested alternative: ‘p_nSupportedSolverPackages’
sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
^~~~~~~~~~~~~~~~~~~~~~~~
p_nSupportedSolverPackages
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:43: error: ‘MatSolverPackage’ was not declared in this scope
sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
^~~~~~~~~~~~~~~~
/home/lei/Install_package/GridPACK_lei/src/math/petsc/petsc_linear_matrix_solver_impl.hpp:311:43: note: suggested alternative: ‘MatSolverType’
sizeof(p_supportedSolverPackage)/sizeof(MatSolverPackage);
^~~~~~~~~~~~~~~~
MatSolverType
I finally chose openmpi 2.1.6, boost 1.70.0, ga 5.6.1/ga 5.7/ga 5.8, and petsc 3.7.7. This combination passes "make -j 4" but fails two tests in "make test" (55 - petsc_ga_matrix_parallel (Failed), 76 - real_time_path_rating_serial (Timeout)), which I posted in my last reply. I think the problem lies with ga and petsc; do you have any recommendation on their versions? Thank you so much.
Looks like this is resolved.
Hello, I ran into an error when building GridPACK. The system I use is Ubuntu 18.04. The following is the error info.
And my cmake .sh is
And after executing it, the info is
I think that means each package was found successfully. So could you please help me solve this problem? Is there a dependency I didn't install, or a version that is not suitable for GridPACK? Thank you so much.
I also noticed something strange: I installed openmpi v4.1.0, but the info shows that the MPI version is v3.1.0.