GEOS-DEV / GEOS: GEOS Simulation Framework

Some bugs with Metis Partition #1243

Closed: Ron-Wang closed this issue 1 year ago

Ron-Wang commented 3 years ago

I have found something wrong when using Metis to partition the mesh. It seems that some nodes whose ghostRank is >= 0 are being sent for communication. The error reads: "trying to set ghostRank of non-locally owned index: m_ghostRank[10]=2"

joshua-white commented 3 years ago

@Ron-Wang, could you provide more specifics on how to reproduce this error? What problem are you running? Is it an existing XML in the repository? Details about the compiler, platform, and run line would also be helpful.

Ron-Wang commented 3 years ago

Thanks for your reply! I am running a problem to test Metis. The mesh is 4×4 (figure 1: model). I try to partition it into 4 parts, and Metis partitions it as shown in figure 2 (metis_partition). However, an error appears:

Writing into the GEOSX mesh data structure
ERROR LOCATION: /home/ronwang/GEOSX/GEOSX/src/coreComponents/managers/ObjectManagerBase.hpp:218
***** Controlling expression (should be false): m_ghostRank[index] >= 0
trying to set ghostRank of non-locally owned index: m_ghostRank[10]=2

Finally, I found that the node I mark in figure 3 (ghostRank_error) is mistakenly treated as a local node. How should I fix it? Thanks a lot!!
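
For reference, here is a minimal, self-contained sketch (not the GEOSX code) of partitioning the cell graph of a 4×4 grid into 4 parts with the standard METIS C API, to illustrate the kind of decomposition shown in figure 2:

```cpp
// Sketch only: partition the cell graph of a 4x4 grid into 4 parts.
// Uses the public METIS_PartGraphKway API; GEOSX's internal usage differs.
#include <metis.h>
#include <cstdio>
#include <vector>

int main()
{
  const idx_t n = 4;  // 4x4 grid of cells
  idx_t nvtxs = n * n, ncon = 1, nparts = 4, objval = 0;
  std::vector<idx_t> xadj{0}, adjncy, part(nvtxs);

  // Build CSR adjacency: each cell is connected to its grid neighbors.
  for (idx_t i = 0; i < n; ++i)
    for (idx_t j = 0; j < n; ++j)
    {
      if (i > 0)     adjncy.push_back((i - 1) * n + j);
      if (i < n - 1) adjncy.push_back((i + 1) * n + j);
      if (j > 0)     adjncy.push_back(i * n + j - 1);
      if (j < n - 1) adjncy.push_back(i * n + j + 1);
      xadj.push_back(static_cast<idx_t>(adjncy.size()));
    }

  // NULL weights/options: uniform vertex and edge weights, default settings.
  if (METIS_PartGraphKway(&nvtxs, &ncon, xadj.data(), adjncy.data(),
                          nullptr, nullptr, nullptr, &nparts,
                          nullptr, nullptr, nullptr, &objval,
                          part.data()) != METIS_OK)
    return 1;

  for (idx_t v = 0; v < nvtxs; ++v)
    std::printf("cell %2d -> part %d\n", int(v), int(part[v]));
  return 0;
}
```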

joshua-white commented 3 years ago

@Ron-Wang Can you post the exact XML you used? @corbett5 and @AntoineMazuyer, does this look like the previous non-neighbor issue that was raised a few months ago?

corbett5 commented 3 years ago

This error comes from some sanity checks I added while fixing the original bug. I'll take a look once the XML and mesh file are up.

Ron-Wang commented 3 years ago

Thanks for your reply! Here are my XML file and mesh file. I have been using the branch named "wangrui", which was created around February 2020, so some of it is out of date now. cube.zip

Ron-Wang commented 3 years ago

Actually, I used a large mesh for a simulation several weeks ago, and I also hit this problem when partitioning it into 8 or more parts. I have checked the GEOSX code and found that the error comes from ObjectManagerBase::SetGhostRankForSenders(), which checks that m_ghostRank[indexToSend] is not >= 0. However, I don't understand why some m_ghostRank[indexToSend] >= 0, which is what triggers this error. Thanks a lot!
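
For readers following along, here is a minimal sketch of the sanity check being described, assuming the convention (used throughout this thread) that a negative ghostRank means "locally owned"; this is an illustration, not the actual GEOSX implementation:

```cpp
// Illustrative only; the real ObjectManagerBase::SetGhostRankForSenders()
// differs in detail.
#include <stdexcept>
#include <string>
#include <vector>

void setGhostRankForSenders(std::vector<int>& ghostRank,
                            std::vector<int> const& indicesToSend)
{
  for (int index : indicesToSend)
  {
    // A rank may only send indices it owns. A non-negative ghost rank means
    // another rank owns this index, so sending it indicates a ghosting bug.
    if (ghostRank[index] >= 0)
    {
      throw std::logic_error(
        "trying to set ghostRank of non-locally owned index: m_ghostRank[" +
        std::to_string(index) + "]=" + std::to_string(ghostRank[index]));
    }
    // Placeholder marker for "owned here, ghosted on other ranks"; the real
    // code's convention may differ.
    ghostRank[index] = -2;
  }
}
```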

corbett5 commented 3 years ago

Thanks! I'll try it out this evening. My guess is that this is already fixed in develop, but I'll find out.

corbett5 commented 3 years ago

@Ron-Wang sorry for the delay. The problem appears to be fixed in develop, although your XML file is out of date. Below is a modified version of your XML that worked for me, although I removed a lot of things that weren't relevant to the problem at hand. I see that the wangrui branch has 81 commits that haven't made it into develop yet; I recommend updating your branch with develop, and that should fix the issue.

<Problem>
  <Solvers
    gravityVector="0.0, -9.81, 0.0">
    <Hydrofracture 
      name="hydrofracture" 
      solidSolverName="lagsolve"
      fluidSolverName="SinglePhaseFlow"
      couplingTypeOption="FIM"
      logLevel="0"
      discretization="FE1"
      targetRegions="{Region1}"
      contactRelationName="">
      <NonlinearSolverParameters
        newtonTol="1.0e-5"
        newtonMaxIter="50"
        lineSearchMaxCuts="10"/>
      <LinearSolverParameters
        logLevel="0"
        solverType="direct"/>
    </Hydrofracture>

    <SolidMechanics_LagrangianFEM
      name="lagsolve"
      timeIntegrationOption="ExplicitDynamic"
      cflFactor="0.001"
      discretization="FE1"
      targetRegions="{Region1}"
      solidMaterialNames="{soil}"
      massDamping="0.0015"/>

    <SinglePhaseFVM
      name="SinglePhaseFlow"
      logLevel="0"
      discretization="singlePhaseTPFA"
      targetRegions="{Region1}"
      fluidNames="{ water }"
      solidNames="{ soil }">
    </SinglePhaseFVM>

  </Solvers>

  <Mesh>
    <PAMELAMeshGenerator
      name="CubeMesh"
      file="cube.msh"/>
  </Mesh>

  <Events maxTime="50">
    <PeriodicEvent name="solverApplications" 
                   target="/Solvers/hydrofracture" />
  </Events>

  <NumericalMethods>

    <FiniteElements>
      <FiniteElementSpace name="FE1" order="1"/>
    </FiniteElements>

    <FiniteVolume>
      <TwoPointFluxApproximation name="singlePhaseTPFA"
                                 fieldName="pressure"
                                 coefficientName="permeability"/>
    </FiniteVolume>

  </NumericalMethods>

  <ElementRegions>
    <CellElementRegion name="Region1" materialList="{water, soil}" />
  </ElementRegions>

  <Constitutive>

    <CompressibleSinglePhaseFluid
      name="water"
      defaultDensity="1000"
      defaultViscosity="0.001"
      referencePressure="0"
      referenceDensity="1000"
      compressibility="4.5e-10"
      referenceViscosity="0.001"
      viscosibility="0.0"/>

    <PoroLinearElasticIsotropic
      name="soil"
      defaultDensity="2700"
      defaultBulkModulus="4e6"
      defaultShearModulus="0.4e6"
      BiotCoefficient="1"
      referencePressure="0"/>

  </Constitutive>

</Problem>

david4utd commented 3 years ago

@corbett5 Thank you Ben, I think it is fixed in develop. The wangrui branch was branched off the singlePhaseFlowExplicit branch of @huang40, which has not been merged into develop. We have added a plasticity constitutive model in the branch. The goal is to work on seismic soil-water coupling problems, which rely on the singlePhaseFlowExplicit solver and an explicit coupling scheme that is not in the develop branch. We will have to see if we can merge these into develop, and also whether the implicit coupling in develop works for us.

joshua-white commented 3 years ago

@david4utd Out of curiosity, what sort of plasticity model have you implemented? We have some models being implemented right now in other branches (notably Drucker-Prager, Cam-Clay, and Delft Egg).

david4utd commented 3 years ago

@joshua-white It's a plasticity model for analyzing soil liquefaction. We are mainly using it for seismic analysis. Here is a link to the paper: https://www.sciencedirect.com/science/article/pii/S0266352X14000378. Implementation of the model in GEOSX is very smooth. We need dynamic solid-fluid coupling, which works with the previous singlePhaseFlowExplicit solver. I see that there is an implicitly coupled solver in develop now; I'm not sure whether it works for dynamic problems?

Ron-Wang commented 3 years ago

@corbett5 Thanks a lot! I tried to install the latest version of GEOSX, but I ran into some problems. At first it asked me to install HDF5, which I did. Now it asks me to install conduit:

CMake Error at cmake/thirdparty/SetupGeosxThirdParty.cmake:141 (message): GEOSX requires conduit, set CONDUIT_DIR to the conduit installation directory.

However, I can't find a conduit package on the Internet. I am using Ubuntu 16.04, and I am not sure whether I need some extra permissions or a newer operating system.

corbett5 commented 3 years ago

@Ron-Wang you'll need to build our third-party libraries, most of which do not exist as standard Linux packages. See https://geosx-geosx.readthedocs-hosted.com/en/latest/docs/sphinx/buildGuide/Index.html

You should have had to do this for your previous branch as well, although you'll need to rebuild them for develop.

klevzoff commented 3 years ago

@Ron-Wang @corbett5 this may also be due to the latest changes in our CMake files. My nightly builds recently broke because I used to just set GEOSX_TPL_ROOT_DIR in host-configs, but that no longer works. My understanding is that one must now either set each TPL variable (like CONDUIT_DIR) separately, or set GEOSX_TPL_DIR and include tpls.cmake in the host-config (this is what I did). If you have any local host-config files on your system, they may need to be updated.
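
A sketch of the two options described above, with placeholder paths (the include path for tpls.cmake is an assumption and may differ in your checkout):

```cmake
# Hypothetical host-config fragment; adjust paths to your TPL installation.

# Option 1: point at each third-party library individually.
# set(CONDUIT_DIR "/path/to/thirdPartyLibs/install/conduit" CACHE PATH "")

# Option 2: set the TPL root once and let tpls.cmake derive the per-library
# variables (this is the approach described above).
set(GEOSX_TPL_DIR "/path/to/thirdPartyLibs/install" CACHE PATH "")
include(${CMAKE_CURRENT_LIST_DIR}/tpls.cmake)
```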

Ron-Wang commented 3 years ago

@klevzoff @corbett5 Thanks for your reply! I will try to build the new third-party libraries.

Ron-Wang commented 3 years ago

@corbett5 Thanks a lot for your help! I can run the newest GEOSX on my machine now. I wonder whether the HydrofractureSolver or the PoroelasticSolver can solve problems of dynamic porous media (for example, we want to use GEOSX to run some simulations of sand liquefaction during earthquakes). I may also need some time to understand the new conventions in GEOSX.

corbett5 commented 3 years ago

@Ron-Wang awesome! I don't know much about the capabilities of those solvers, but @joshua-white might, or at least he could refer you to someone else.

joshua-white commented 3 years ago

@Ron-Wang Both solvers assume quasi-static mechanics, with no inertial terms. You would have to extend the poroelastic solver a bit to include some additional terms. The dynamic solid mechanics equations (with no pore fluid) are already in place (with Newmark time integration), so the work would mainly focus on the flow and poroelastic coupling terms. We could discuss further if you're interested.
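
Schematically, the difference is the inertial term in the momentum balance; in standard poromechanics notation (not taken from the GEOSX docs), with effective stress $\sigma'$, Biot coefficient $b$, pore pressure $p$, bulk density $\rho$, and displacement $u$:

$$\nabla \cdot (\sigma' - b\,p\,I) + \rho\,g = 0 \quad \text{(quasi-static, as in the current solvers)}$$

$$\rho\,\ddot{u} = \nabla \cdot (\sigma' - b\,p\,I) + \rho\,g \quad \text{(dynamic, with the inertial term a seismic analysis needs)}$$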

Ron-Wang commented 3 years ago

@joshua-white Thanks a lot! I am now trying to finish the new explicit coupled flow-mechanics solver from the feature/huang40/ExplicitCoupledFlowMechanicsSolver branch. It extends the HydrofractureSolver with a couplingTypeOption of "FEM" (so the options become FIM, SIM_FixedStress, FEM), and extends the FlowSolverBase with a timeIntegrationOption of "ExplicitTransient".
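
For concreteness, a rough guess at how those options might surface in an input file, using only the option names quoted above (the actual branch may spell this differently, and other required attributes are omitted):

```xml
<!-- Hypothetical sketch: couplingTypeOption gains a "FEM" value and the
     flow solver gains timeIntegrationOption="ExplicitTransient". -->
<Hydrofracture name="hydrofracture" couplingTypeOption="FEM" />
<SinglePhaseFVM name="SinglePhaseFlow" timeIntegrationOption="ExplicitTransient" />
```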

TotoGaz commented 1 year ago

It looks like this was fixed, and it relates to PAMELA, which is no longer used.