Closed: Hostudent closed this issue 3 years ago.
@Hostudent Please look in your BUILD_DIRECTORY/Testing/Temporary/LastTest.log for error messages that the failing tests may have produced. With just the list of test failures, it is not possible for us to determine what is going wrong in your environment.
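For reference, the log and a targeted re-run can be inspected like this (a sketch; `$BUILD_DIRECTORY` stands for your actual build directory):

```shell
# Run from your Trilinos build directory (path is a placeholder).
cd "$BUILD_DIRECTORY"

# Show the tail of the aggregate test log mentioned above.
tail -n 200 Testing/Temporary/LastTest.log

# Or re-run only the tests that failed and print their output directly.
ctest --rerun-failed --output-on-failure
```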
Note that I've not had good experiences with OpenMPI v3; we've had better experiences with OpenMPI v2.
@Hostudent Are you able to build and run an MPI program, with the MPI installation that you used to build Trilinos? Given that all the tests failed, it looks like MPI itself is not running correctly.
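A minimal way to check this, independent of Trilinos (a sketch; the file name `hello_mpi.c` is just an example, and `mpicc`/`mpiexec` must be the ones from the MPI installation used to configure Trilinos):

```shell
# Write a tiny MPI program (example file name).
cat > hello_mpi.c <<'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF

# Build and run it with the same MPI used for the Trilinos build.
mpicc hello_mpi.c -o hello_mpi
mpiexec -np 4 ./hello_mpi
```

If this does not print one "rank i of 4" line per process, the MPI installation itself is broken, not Trilinos.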
@mhoemmen Yes, I have been using LAMMPS in MPI mode. If you compare the two lists of failed tests: with MPICH (the second list) the Zoltan MPI tests all pass, but with Open MPI (the first list) some of the Zoltan MPI tests fail.
@kddevin Can you please tell me which versions of the dependencies are the most stable? (openmpi, netcdf, boost, gcc, hdf5, curl, ...) I'm using gcc 8, but during the installation of netcdf I got this warning: "Pouring netcdf-4.6.1_4.mojave.bottle.tar.gz Warning: netcdf dependency gcc was built with a different C++ standard library (libstdc++ from clang). This may cause problems at runtime." What does that mean?
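Regarding that Homebrew warning: GCC links C++ code against GNU libstdc++, while Apple's clang toolchain uses LLVM libc++; mixing the two across a shared-library boundary can cause runtime incompatibilities. You can see which C++ runtime a given library was linked against with `otool` (a sketch; the library path below is an example for a default Homebrew prefix, adjust to your install):

```shell
# List the shared libraries a dylib links against and pick out the C++
# runtime (libstdc++ vs. libc++). The path is an example; adjust as needed.
otool -L /usr/local/lib/libnetcdf.dylib | grep -i 'c++'
```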
Teuchos::GlobalMPISession::GlobalMPISession(): started processor with name Hoseins-Mac.local and rank 0!
*** Unit test suite ...
Sorting tests by group name then by the order they were added ... (time = 1.29e-05)
Running unit tests ...
Int_BadAssignment_UnitTest ...
i2 = 5 == i1 = 4 : FAILED ==> /Users/test/desktop/trilinos/packages/teuchos/core/test/UnitTest/Failing_UnitTest.cpp:54
[FAILED] (1.12e-05 sec) Int_BadAssignment_UnitTest
Location: /Users/test/desktop/trilinos/packages/teuchos/core/test/UnitTest/Failing_UnitTest.cpp:50
VectorInt_OutOfRangeAt_UnitTest ...
p=0: *** Caught standard std::exception of type 'std::out_of_range' :
vector
[FAILED] (0.000582 sec) VectorInt_OutOfRangeAt_UnitTest
Location: /Users/test/desktop/trilinos/packages/teuchos/core/test/UnitTest/Failing_UnitTest.cpp:58
The following tests FAILED:
Total Time: 0.000972 sec
Summary: total = 5, run = 5, passed = 3, failed = 2
mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was:
Either request fewer slots for your application, or make more slots available for use.
The Zoltan test failures appear to be a problem with your system's configuration. This error message seems to indicate your system doesn't support the number of MPI processes requested for these tests. The full Zoltan test suite requires up to 11 MPI processes.
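By default, recent Open MPI releases refuse to launch more ranks than the "slots" they detect (typically the number of cores), which matches the "request fewer slots" message above. If the machine has fewer than 11 cores, one workaround is oversubscription (a sketch; the test binary name is a placeholder, and the CMake cache variable should be checked against your Trilinos version's TriBITS documentation):

```shell
# Allow Open MPI to start more ranks than detected slots for a single run.
mpiexec --oversubscribe -np 11 ./some_zoltan_test

# To apply this to every MPI test that Trilinos runs through CTest, a
# TriBITS cache variable like the following can be set at configure time:
#   -D MPI_EXEC_PRE_NUMPROCS_FLAGS="--oversubscribe"
```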
This issue has had no activity for 365 days and is marked for closure. It will be closed after an additional 30 days of inactivity.
If you would like to keep this issue open please add a comment and/or remove the MARKED_FOR_CLOSURE label.
If this issue should be kept open even with no activity beyond the time limits you can add the label DO_NOT_AUTOCLOSE.
If it is ok for this issue to be closed, feel free to go ahead and close it. Please do not add any comments or change any labels or otherwise touch this issue unless your intention is to reset the inactivity counter for an additional year.
This issue was closed due to inactivity for 395 days.
First, I have to say that I'm a real amateur with Unix systems, and I have been trying to install Trilinos (in order to install Peridigm) on my MacBook (updated to Mojave) for the last 5 days! After all that, I managed to reduce the number of failed tests to 25, listed below. Can anyone help me fix them, please?
The following tests FAILED:
113 - TeuchosComm_Time_test_MPI_1 (Failed)
465 - Zoltan_ch_brack2_3_zoltan_parallel (Failed)
467 - Zoltan_ch_degenerate_zoltan_parallel (Failed)
468 - Zoltan_ch_degenerateAA_zoltan_parallel (Failed)
471 - Zoltan_ch_grid20x19_zoltan_parallel (Failed)
472 - Zoltan_ch_hammond_zoltan_parallel (Failed)
473 - Zoltan_ch_hammond2_zoltan_parallel (Failed)
474 - Zoltan_ch_hughes_zoltan_parallel (Failed)
482 - Zoltan_hg_cage10_zoltan_parallel (Failed)
485 - Zoltan_hg_felix_zoltan_parallel (Failed)
486 - Zoltan_hg_ibm03_zoltan_parallel (Failed)
500 - TpetraCore_gemv_MPI_1 (Failed)
501 - TpetraCore_gemm_m_eq_1_MPI_1 (Failed)
502 - TpetraCore_gemm_m_eq_2_MPI_1 (Failed)
503 - TpetraCore_gemm_m_eq_5_MPI_1 (Failed)
504 - TpetraCore_gemm_m_eq_13_MPI_1 (Failed)
542 - TpetraCore_CrsMatrix_UnitTests_MPI_4 (Failed)
543 - TpetraCore_CrsMatrix_UnitTests2_MPI_4 (Failed)
544 - TpetraCore_CrsMatrix_UnitTests3_MPI_4 (Failed)
545 - TpetraCore_CrsMatrix_UnitTests4_MPI_4 (Failed)
550 - TpetraCore_CrsMatrix_ReplaceDomainMapAndImporter_MPI_4 (Failed)
553 - TpetraCore_CrsMatrix_gaussSeidel_MPI_4 (Failed)
904 - SEACASExodus_exodus_unit_tests_nc5_env (Failed)
1022 - Teko_testdriver_MPI_1 (Failed)
1037 - Teko_ModALPreconditioner_MPI_1 (Failed)
I think the problem is with Open MPI, because before Open MPI I used MPICH, and the failed tests were as below:
The following tests FAILED:
113 - TeuchosComm_Time_test_MPI_1 (Failed)
409 - Epetra_BlockMap_test_MPI_4 (Failed)
428 - Epetra_IntSerialDense_test_MPI_1 (Failed)
429 - Epetra_Map_test_MPI_1 (Failed)
440 - Epetra_SimpleLongLongTest_MPI_4 (Failed)
441 - Epetra_BlockMap_test_LL_MPI_4 (Failed)
454 - Epetra_Map_test_LL_MPI_1 (Failed)
500 - TpetraCore_gemv_MPI_1 (Failed)
501 - TpetraCore_gemm_m_eq_1_MPI_1 (Failed)
502 - TpetraCore_gemm_m_eq_2_MPI_1 (Failed)
503 - TpetraCore_gemm_m_eq_5_MPI_1 (Failed)
504 - TpetraCore_gemm_m_eq_13_MPI_1 (Failed)
534 - TpetraCore_CrsGraph_UnitTests0_MPI_4 (Failed)
535 - TpetraCore_CrsGraph_UnitTests1_MPI_4 (Failed)
539 - TpetraCore_CrsGraph_PackUnpack_MPI_1_MPI_1 (Failed)
542 - TpetraCore_CrsMatrix_UnitTests_MPI_4 (Failed)
543 - TpetraCore_CrsMatrix_UnitTests2_MPI_4 (Failed)
544 - TpetraCore_CrsMatrix_UnitTests3_MPI_4 (Failed)
545 - TpetraCore_CrsMatrix_UnitTests4_MPI_4 (Failed)
549 - TpetraCore_CrsMatrix_WithGraph_Serial_MPI_4 (Failed)
550 - TpetraCore_CrsMatrix_ReplaceDomainMapAndImporter_MPI_4 (Failed)
553 - TpetraCore_CrsMatrix_gaussSeidel_MPI_4 (Failed)
559 - TpetraCore_CrsMatrix_MultipleFillCompletes_MPI_4 (Failed)
563 - TpetraCore_CrsMatrix_PackUnpack_MPI_1_MPI_1 (Failed)
567 - TpetraCore_Directory_UnitTests_MPI_4 (Failed)
627 - TpetraCore_MultiVector_UnitTests_MPI_4 (Failed)
661 - EpetraExt_inout_hdf5_test_MPI_4 (Failed)
812 - ML_SelfSmoother_MPI_4 (Failed)
904 - SEACASExodus_exodus_unit_tests_nc5_env (Failed)
940 - Anasazi_Epetra_BlockDavidson_solvertest_MPI_4 (Failed)
951 - Anasazi_Epetra_BKS_solvertest_MPI_4 (Failed)
958 - Anasazi_Epetra_GeneralizedDavidson_solvertest_MPI_4 (Failed)
961 - Anasazi_Epetra_LOBPCG_solvertest_MPI_4 (Failed)
978 - Anasazi_SortManager_test_MPI_4 (Failed)
979 - Anasazi_StatusTest_test_MPI_4 (Failed)
1005 - Stratimikos_test_single_belos_thyra_solver_driver_FourByFour_MPI_1 (Failed)
1006 - Stratimikos_test_single_belos_thyra_solver_driver_nos5_kl190_MPI_1 (Failed)
1007 - Stratimikos_test_single_belos_thyra_solver_driver_nos1_np_MPI_1 (Failed)
1008 - Stratimikos_test_single_belos_thyra_solver_driver_nos1_MPI_1 (Failed)
1009 - Stratimikos_test_single_belos_thyra_solver_driver_nos1_nrhs8_MPI_1 (Failed)
1018 - Stratimikos_test_single_stratimikos_solver_driver_belos_np_MPI_1 (Failed)
1019 - Stratimikos_test_single_stratimikos_solver_driver_belos_ifpack_MPI_1 (Failed)
1020 - Stratimikos_test_single_stratimikos_solver_driver_belos_ml_MPI_1 (Failed)
1021 - Teko_testdriver_MPI_4 (Failed)
1022 - Teko_testdriver_MPI_1 (Failed)
1023 - Teko_testdriver_tpetra_MPI_4 (Failed)
1024 - Teko_testdriver_tpetra_MPI_1 (Failed)
1025 - Teko_IterativePreconditionerFactory_test_MPI_1 (Failed)
1026 - Teko_LU2x2InverseOp_test_MPI_1 (Failed)
1028 - Teko_RequestInterface_test_MPI_1 (Failed)
1029 - Teko_DiagnosticLinearOp_test_MPI_1 (Failed)
1030 - Teko_DiagonallyScaledPreconditioner_MPI_1 (Failed)
1031 - Teko_InverseFactoryOperator_MPI_1 (Failed)
1034 - Teko_StratimikosFactory_MPI_1 (Failed)
1037 - Teko_ModALPreconditioner_MPI_1 (Failed)
1046 - Intrepid_test_Discretization_Basis_HCURL_HEX_I1_FEM_Test_01_MPI_1 (Failed)
1047 - Intrepid_test_Discretization_Basis_HCURL_HEX_In_FEM_Test_01_MPI_1 (Failed)
1049 - Intrepid_test_Discretization_Basis_HCURL_TET_I1_FEM_Test_01_MPI_1 (Failed)
1050 - Intrepid_test_Discretization_Basis_HCURL_WEDGE_I1_FEM_Test_01_MPI_1 (Failed)
1051 - Intrepid_test_Discretization_Basis_HCURL_TRI_I1_FEM_Test_01_MPI_1 (Failed)
1052 - Intrepid_test_Discretization_Basis_HCURL_QUAD_I1_FEM_Test_01_MPI_1 (Failed)
1053 - Intrepid_test_Discretization_Basis_HCURL_QUAD_In_FEM_Test_01_MPI_1 (Failed)
1054 - Intrepid_test_Discretization_Basis_HDIV_HEX_I1_FEM_Test_01_MPI_1 (Failed)
1055 - Intrepid_test_Discretization_Basis_HDIV_HEX_In_FEM_Test_01_MPI_1 (Failed)
1057 - Intrepid_test_Discretization_Basis_HDIV_TET_I1_FEM_Test_01_MPI_1 (Failed)
1058 - Intrepid_test_Discretization_Basis_HDIV_TRI_I1_FEM_Test_01_MPI_1 (Failed)
1059 - Intrepid_test_Discretization_Basis_HDIV_QUAD_I1_FEM_Test_01_MPI_1 (Failed)
1060 - Intrepid_test_Discretization_Basis_HDIV_QUAD_In_FEM_Test_01_MPI_1 (Failed)
1062 - Intrepid_test_Discretization_Basis_HDIV_WEDGE_I1_FEM_Test_01_MPI_1 (Failed)
1065 - Intrepid_test_Discretization_Basis_HGRAD_LINE_Cn_FEM_Test_01_MPI_1 (Failed)
1069 - Intrepid_test_Discretization_Basis_HGRAD_LINE_Hermite_FEM_Test_01_MPI_1 (Failed)
1071 - Intrepid_test_Discretization_Basis_HGRAD_QUAD_C1_FEM_Test_01_MPI_1 (Failed)
1073 - Intrepid_test_Discretization_Basis_HGRAD_QUAD_C2_FEM_Test_01_MPI_1 (Failed)
1075 - Intrepid_test_Discretization_Basis_HGRAD_QUAD_Cn_FEM_Test_01_MPI_1 (Failed)
1077 - Intrepid_test_Discretization_Basis_HGRAD_TRI_C1_FEM_Test_01_MPI_1 (Failed)
1079 - Intrepid_test_Discretization_Basis_HGRAD_TRI_C2_FEM_Test_01_MPI_1 (Failed)
1081 - Intrepid_test_Discretization_Basis_HGRAD_HEX_C1_FEM_Test_01_MPI_1 (Failed)
1083 - Intrepid_test_Discretization_Basis_HGRAD_HEX_C2_FEM_Test_01_MPI_1 (Failed)
1085 - Intrepid_test_Discretization_Basis_HGRAD_HEX_I2_FEM_Test_01_MPI_1 (Failed)
1087 - Intrepid_test_Discretization_Basis_HGRAD_HEX_Cn_FEM_Test_01_MPI_1 (Failed)
1089 - Intrepid_test_Discretization_Basis_HGRAD_TET_C1_FEM_Test_01_MPI_1 (Failed)
1091 - Intrepid_test_Discretization_Basis_HGRAD_TET_C2_FEM_Test_01_MPI_1 (Failed)
1102 - Intrepid_test_Discretization_Basis_HGRAD_WEDGE_C1_FEM_Test_01_MPI_1 (Failed)
1104 - Intrepid_test_Discretization_Basis_HGRAD_WEDGE_C2_FEM_Test_01_MPI_1 (Failed)
1105 - Intrepid_test_Discretization_Basis_HGRAD_WEDGE_I2_FEM_Test_01_MPI_1 (Failed)
1107 - Intrepid_test_Discretization_Basis_HGRAD_PYR_C1_FEM_Test_01_MPI_1 (Failed)
1109 - Intrepid_test_Discretization_Basis_HGRAD_PYR_I2_FEM_Test_01_MPI_1 (Failed)
1118 - Intrepid_test_Discretization_Integration_Test_01_MPI_1 (Failed)
1154 - Intrepid_test_Shared_IntrepidPolylib_Test_01_MPI_1 (Failed)
1156 - Intrepid_test_Shared_PointTools_Test_01_MPI_1 (Failed)
1240 - NOX_Thyra_JFNK_MPI_1 (Failed)
1272 - NOX_Tpetra_1DFEM_MPI_4 (Failed)
1274 - Rythmos_ConvergenceTestHelpers_UnitTest_MPI_1 (Failed)
1277 - Rythmos_ExplicitRK_UnitTest_MPI_1 (Failed)
1279 - Rythmos_ImplicitRK_UnitTest_MPI_1 (Failed)
1280 - Rythmos_IntegratorBuilder_UnitTest_MPI_1 (Failed)
1281 - Rythmos_InterpolationBuffer_UnitTest_MPI_1 (Failed)
1284 - Rythmos_RKButcherTableau_UnitTest_MPI_1 (Failed)
1285 - Rythmos_SinCosModel_UnitTest_MPI_1 (Failed)
1286 - Rythmos_VanderPolModel_UnitTest_MPI_1 (Failed)
1290 - Rythmos_StepperBuilder_UnitTest_MPI_1 (Failed)
1291 - Rythmos_StepperHelpers_UnitTest_MPI_1 (Failed)
1297 - Rythmos_ForwardEulerStepper_UnitTest_MPI_1 (Failed)
1298 - Rythmos_BackwardEulerStepper_UnitTest_MPI_1 (Failed)
1304 - Rythmos_IntegrationObservers_UnitTest_MPI_1 (Failed)
1385 - Stokhos_TpetraCrsMatrixMPVectorUnitTest_Serial_MPI_4 (Failed)
1390 - Piro_MatrixFreeDecorator_UnitTests_MPI_4 (Failed)
1394 - Piro_AnalysisDriverTpetra_MPI_4 (Failed)
1398 - Piro_RythmosSolver_UnitTests_MPI_4 (Failed)
The installed dependencies on my system (all installed with Homebrew): gcc 8.2.0, cmake 3.12.3, openmpi 3.1.3, boost 1.67.0_1, hdf5 1.10.4, netcdf 4.6.1_4.
And I should mention that I set all the environment variables as recommended!