GEOS-DEV / GEOS

GEOS Simulation Framework
GNU Lesser General Public License v2.1

SuiteSparse: umfpack_dl_numeric failed #2315

Closed: FishYNY closed this issue 1 year ago

FishYNY commented 1 year ago

I don't know what is going wrong.

GEOSX version: 0.2.0 (develop, sha1: bbd1948c0)

UMFPACK V5.7.9 (Oct 20, 2019): ERROR: out of memory

ERROR LOCATION: /home/pku/codes/codes02232023/GEOSX/src/coreComponents/linearAlgebra/interfaces/direct/SuiteSparse.cpp:158
Controlling expression (should be false): true
Rank 0: SuiteSparse: umfpack_dl_numeric failed.

StackTrace of 12 frames
Frame 0: geosx::SuiteSparse::setup(geosx::HypreMatrix const&)
Frame 1: geosx::SolverBase::solveLinearSystem(geosx::DofManager const&, geosx::HypreMatrix&, geosx::HypreVector&, geosx::HypreVector&)
Frame 2: geosx::SolverBase::solveNonlinearSystem(double const&, double const&, int, geosx::DomainPartition&)
Frame 3: geosx::SolverBase::nonlinearImplicitStep(double const&, double const&, int, geosx::DomainPartition&)
Frame 4: geosx::SolverBase::solverStep(double const&, double const&, int, geosx::DomainPartition&)
Frame 5: geosx::CoupledSolver<geosx::MultiphasePoromechanicsSolver, geosx::CompositionalMultiphaseWell>::solverStep(double const&, double const&, int, geosx::DomainPartition&)
Frame 6: geosx::SolverBase::execute(double, double, int, int, double, geosx::DomainPartition&)
Frame 7: geosx::EventBase::execute(double, double, int, int, double, geosx::DomainPartition&)
Frame 8: geosx::EventManager::run(geosx::DomainPartition&)
Frame 9: geosx::GeosxState::run()
Frame 10: main
Frame 11: __libc_start_main
Frame 12: _start


MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.

basemodel_smoke.zip

francoishamon commented 1 year ago

Hello @FishYNY, most likely the error message:

UMFPACK V5.7.9 (Oct 20, 2019): ERROR: out of memory

indicates that your multiphase poromechanics problem is just too large for the direct solver.

I suggest trying with:

      <NonlinearSolverParameters
        newtonTol="1.0e-5"
        lineSearchAction="Attempt"
        newtonMaxIter="40"/>
      <LinearSolverParameters
        solverType="gmres"
        preconditionerType="mgr"/>

(looser nonlinear tolerance + use of the iterative solver)
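
For context, both of these blocks are nested inside the coupled solver definition in the GEOS input deck. Below is a minimal sketch of that nesting; the solver tag, name, sub-solver names, and target regions are placeholders for whatever your deck already declares (only the two parameter blocks are the actual suggestion):

      <Solvers>
        <!-- Placeholder coupled solver: keep the one already present in your deck;
             only the NonlinearSolverParameters and LinearSolverParameters change. -->
        <MultiphasePoromechanics
          name="poromechanicsSolver"
          solidSolverName="linearElasticity"
          flowSolverName="compositionalMultiphaseFlow"
          targetRegions="{ reservoir, top, bottom }">
          <NonlinearSolverParameters
            newtonTol="1.0e-5"
            lineSearchAction="Attempt"
            newtonMaxIter="40"/>
          <LinearSolverParameters
            solverType="gmres"
            preconditionerType="mgr"/>
        </MultiphasePoromechanics>
      </Solvers>

With solverType="gmres" and preconditionerType="mgr", GEOS uses an iterative Krylov solver with hypre's multigrid reduction preconditioner instead of the UMFPACK LU factorization, which avoids the memory blow-up caused by fill-in on large coupled systems.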

FishYNY commented 1 year ago

Hello @francoishamon, thank you for your advice. It does help, but convergence is still not achieved. My problem may indeed be quite large; is there something else I should try?

francoishamon commented 1 year ago

I would suggest the following:

  1. Try initializing the effective stress as done here: https://github.com/GEOSX/GEOSX/blob/5841bf7f79a90368e19e399199891d73fe20ad78/inputFiles/poromechanics/PoroElastic_hybridHexPrism_co2_3d.xml#L192

In your case, given your datumPressure, it will be something like:

    <FieldSpecification
      name="initialSigmaReservoir_x"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/reservoir"
      fieldName="skeleton_stress"
      component="0"
      scale="1e7"/>
    <FieldSpecification
      name="initialSigmaReservoir_y"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/reservoir"
      fieldName="skeleton_stress"
      component="1"
      scale="1e7"/>
    <FieldSpecification
      name="initialSigmaReservoir_z"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/reservoir"
      fieldName="skeleton_stress"
      component="2"
      scale="1e7"/>

    <FieldSpecification
      name="initialSigmaTop_x"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/top"
      fieldName="skeletonBurden_stress"
      component="0"
      scale="1e7"/>
    <FieldSpecification
      name="initialSigmaTop_y"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/top"
      fieldName="skeletonBurden_stress"
      component="1"
      scale="1e7"/>
    <FieldSpecification
      name="initialSigmaTop_z"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/top"
      fieldName="skeletonBurden_stress"
      component="2"
      scale="1e7"/>

    <FieldSpecification
      name="initialSigmaBottom_x"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/bottom"
      fieldName="skeletonBurden_stress"
      component="0"
      scale="1e7"/>
    <FieldSpecification
      name="initialSigmaBottom_y"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/bottom"
      fieldName="skeletonBurden_stress"
      component="1"
      scale="1e7"/>
    <FieldSpecification
      name="initialSigmaBottom_z"
      initialCondition="1"
      setNames="{ all }"
      objectPath="ElementRegions/bottom"
      fieldName="skeletonBurden_stress"
      component="2"
      scale="1e7"/>

Then look at the first vtk output, written before the first time step, to make sure that the initial stress is taken into account; a sketch of the output hookup is given after this list. (Alternatively, use the latest commit on develop, which throws an error if the initial stress is not applied because of a naming problem, etc.)

  2. Make sure that the first mechanics residual (R_solid) is zero before the first linear iteration of the first time step

  3. If the linear solver does not converge, post the simulation log on the issue
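
Regarding the vtk check in point 1, here is a minimal sketch of the output hookup, assuming a VTK output block and a periodic output event (the names and the time frequency below are placeholders; adapt them to your deck):

      <!-- Illustrative output block: name and placement are placeholders. -->
      <Outputs>
        <VTK
          name="vtkOutput"/>
      </Outputs>

and, inside the existing Events block:

      <!-- Illustrative event: timeFrequency value is a placeholder. -->
      <PeriodicEvent
        name="vtkOutputEvent"
        timeFrequency="1e5"
        target="/Outputs/vtkOutput"/>

The file written at time zero is the one to inspect for the initialized stress.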

FishYNY commented 1 year ago

Yes, it works! Thank you! The "targetTotalRate" I set was not appropriate.
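
For anyone landing on this issue with the same symptom: targetTotalRate is an attribute of the WellControls block attached to the compositional multiphase well solver. A minimal, purely illustrative sketch (the name, elevation, BHP, rate, and injection stream values below are placeholders, not the settings from this issue):

      <!-- Illustrative WellControls sketch: all values below are placeholders. -->
      <WellControls
        name="wellControls"
        type="injector"
        control="totalVolRate"
        referenceElevation="0"
        targetBHP="5e7"
        targetTotalRate="1e-4"
        injectionStream="{ 1.0, 0.0 }"/>

A target rate that is too aggressive for the reservoir permeability and the BHP limit can stall the nonlinear solver, which is consistent with the fix reported above.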