DLR-RY / TwoPhaseFlow


Crash issue while using RDF in parallel #41

Open gregadzrodz opened 1 year ago

gregadzrodz commented 1 year ago

The attached simulation fails with a SIGSEGV. This is related to the RDF code: if the reconstruction scheme is changed to gradAlpha, the simulation runs through the crash point without problems. No settings in fvSolution seem to drastically improve the situation. This is most probably related to the way things are parallelized, because when running on a single core the simulation gets past the crash point. Note that there is a FOAM warning at each time step when running in parallel, which disappears when running on a single core.

I am attaching the log file along with my setup. I run on WSL with OpenFOAM v2112, on 18 processes in parallel. Can you tell me if you can replicate the issue?

RDFIssue.zip

gregadzrodz commented 1 year ago

To add: this is related to how decomposePar distributes the cells. In a serial run no FOAM warning is given, on 2 processors the FOAM warning appears again, but when running on 4 it disappears again.

HenningScheufler commented 1 year ago

Are you sure the fluid properties are correct?

gregadzrodz commented 1 year ago

In reality, this is a micrometer-sized system. In the simulation I scale it up by a factor of 10^6 in order to avoid precision errors with very small cell volumes of order (1e-6 m)^3. Basically it is a conversion to a micrometer-based unit system: going from m, kg, s to μm, pg, μs. All the material properties are scaled accordingly and are set up correctly.
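
For reference, a quick check of the derived conversion factors under that choice of units (m, kg, s → μm, pg, μs); the water values in brackets are only generic illustrative numbers, not taken from the attached case:

```
// base factors: 1 m = 1e6 um,  1 kg = 1e15 pg,  1 s = 1e6 us
// density:           1 kg/m^3 = 1e15 pg / (1e6 um)^3 = 1e-3 pg/um^3      (water: 1000   -> 1)
// dynamic viscosity: 1 Pa.s   = 1 kg/(m.s)           = 1e3  pg/(um.us)   (water: 1e-3   -> 1)
// surface tension:   1 N/m    = 1 kg/s^2              = 1e3  pg/us^2     (water: 0.072  -> 72)
// pressure:          1 Pa     = 1 kg/(m.s^2)          = 1e-3 pg/(um.us^2) (1e5 Pa       -> 100)
```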

HenningScheufler commented 1 year ago

The interface is not perfectly sharp; there are tiny bubbles in the fluid.

Nevertheless, that should not happen. But at this stage it is very hard to debug. Could you send a case that crashes in the next time step, so I can debug it faster?

gregadzrodz commented 1 year ago

Here is a case much closer to the crash. If you run it, it should crash within a few seconds; see the output file. The interesting thing is that if I decompose it again (and I guess the cells will then be split differently across the processors), the crash point moves a little. And if I do not run it in parallel, I do not get a crash at all. So it must have something to do with the parallelization of the RDF code. I am attaching the case with the decomposition into 18 processors already included. Hopefully it is helpful.

CrashCase_CloserToCrash.zip

As I understand it, you were not able to reproduce the crash?

gregadzrodz commented 1 year ago

Is there a way of setting up isoAdvector to avoid this smearing completely? One thing I find is that if I tighten surfCellTol from 1e-05 to 1e-07, the smearing is reduced, but the aforementioned problem gets worse: crashes happen earlier and more frequently.
compressibleInterFlow with geometric reconstruction already performs massively better with regard to smearing than compressibleInterFoam. But this is quite a specific use case, an interface in a high-speed transonic environment, so this smearing is still present and apparently causing issues.

HenningScheufler commented 1 year ago

This is a special use case; there appears to be a problem with mass conservation (interface cells appear in the middle of the domain). You could try:

- p_rgh: add minIter 2;
- snapTol 1e-6; // snaps alpha values below 1e-6 to 0 and above (1 - 1e-6) to 1
- surfCellTol 1e-6; // you could also try lowering this value (a bit hacky)
- nCorrectors 6; // better mass convergence (the PISO algorithm needs at least one corrector)

You could also try the transonic option (transonic yes; in fvSolution/PIMPLE) and the momentum predictor.

Another possibility would be to limit the temperature via fvOptions.
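
Put together, a rough sketch of where those knobs sit in the case files; the values are placeholders and the limitTemperature bounds are made up purely for illustration:

```
// system/fvSolution -- only the entries touched by the suggestions above
"alpha.*"
{
    // keep the existing isoAdvector/plicRDF settings and add/adjust:
    snapTol         1e-6;   // snap alpha below 1e-6 to 0 and above 1 - 1e-6 to 1
    surfCellTol     1e-6;   // interface-cell detection tolerance (try lowering)
}

p_rgh
{
    // existing solver settings, plus:
    minIter         2;
}

PIMPLE
{
    momentumPredictor   yes;
    nCorrectors         6;      // better mass convergence; PISO needs at least one
    transonic           yes;    // transonic treatment of the pressure equation
}

// constant/fvOptions -- optional temperature limiter (bounds are placeholders)
limitT
{
    type            limitTemperature;
    active          yes;
    selectionMode   all;
    min             200;
    max             2000;
}
```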

MitchellWhiting commented 1 year ago

Hi, I seem to be running into a similar issue of interface cells appearing in the middle of the domain, see the attached photo. It's an axisymmetric mesh, with the satellite bubbles preferring to form close to the axis. The problem gets much worse with mesh refinement. I have tried the recommended settings you gave above, and while they have improved the stability of my case, they haven't stopped the unwanted formation of these bubbles. Here is a look at my fvSolution:

```
solvers
{

alpha.water
{
    nAlphaCorr      4;
    nAlphaSubCycles 4;
    cAlpha          1;

    MULESCorr       no;
    nLimiterIter    5;

    solver          smoothSolver;
    smoother        symGaussSeidel;
    tolerance       1e-8;
    relTol          0;
    advectionScheme isoAdvection;
    reconstructionScheme plicRDF; //isoAlpha;
    vof2IsoTol 1e-8;
    surfCellTol 1e-6;
    snapTol     1e-6;
    writeVTK true;
}

psiFinal
{
    solver           PCG;
    preconditioner   DIC;
    tolerance        1e-7;
    relTol           0.00;
}

rhoCpLFinal
{
    solver           diagonal;
    preconditioner   DILU;
    tolerance        1e-7;
    relTol           0.1;
}

rhoCpVFinal
{
    solver           diagonal;
    preconditioner   DILU;
    tolerance        1e-7;
    relTol           0.1;
}

rho
{
    solver          PCG;
    preconditioner  DIC;
    tolerance       1e-7;
    relTol          0.1;
}

rhoFinal
{
    $rho;
    tolerance       1e-7;
    relTol          0;
}

p_rgh
{
    solver           GAMG;
    tolerance        1e-7;
    relTol           0.01;
    smoother         DIC;
    minIter       3;
    maxIter       15;
    nCellsInCoarsestLevel 100;
}

p_rghFinal
{
    $p_rgh;
    tolerance        1e-7;
    relTol           0;
    minIter       3;
    maxIter       20;
}

"(U|h|T.*|k|epsilon|R)"
{
    solver           smoothSolver; //PBiCGStab;
    smoother         symGaussSeidel;
    //preconditioner   DILU;
    tolerance        1e-7;
    relTol           0.0;
    minIter          15;
    maxIter          50;
}

"(U|h|T.*|k|epsilon|R)Final"
{
    $U;
    tolerance        1e-7;
    relTol           0;
    maxIter          50;
}

}

PIMPLE
{
    momentumPredictor           no;
    nCorrectors                 6;
    nNonOrthogonalCorrectors    2;
}

relaxationFactors
{
    equations
    {
        "h."    1;
        "U."    1;
    }
}
```

Could it maybe be due to excessive superheat? At the moment I am initializing a thermal boundary layer that might be a bit excessive, and I am going to investigate whether it has an impact. Any advice would be appreciated, thank you in advance. (Attached image: Satellite_Bubbles)

HenningScheufler commented 1 year ago

Is the maxCapillaryNum in the controlDict smaller than 1? For example: maxCapillaryNum 0.5;
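
For context, a minimal sketch of how that entry sits among the usual time-step controls in system/controlDict; the other values here are generic placeholders, not taken from this case:

```
// system/controlDict -- time-step control (placeholder values)
adjustTimeStep      yes;
maxCo               0.5;    // convective Courant number limit
maxAlphaCo          0.5;    // interface Courant number limit
maxCapillaryNum     0.5;    // capillary time-step constraint, keep below 1
maxDeltaT           1;
```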

HenningScheufler commented 1 year ago

Try:

  1. Decrease the tolerance of p_rghFinal and p_rgh to 1e-8, e.g.:

     { $p_rgh; tolerance 1e-8; relTol 0; minIter 3; maxIter 20; }

  2. Remove the relTol entries from the dicts that use the diagonal solver.
  3. On a structured grid, nNonOrthogonalCorrectors have no impact on the solution.
  4. Try momentumPredictor yes;
  5. Use pointCellsLeastSquares as the gradient scheme (a sketch follows below).
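
For reference, a minimal sketch of how the gradient-scheme suggestion might look in system/fvSchemes; the keyword spelling pointCellsLeastSquares is assumed from the OpenFOAM scheme of that name:

```
// system/fvSchemes -- gradient scheme suggestion (sketch)
gradSchemes
{
    default         pointCellsLeastSquares;  // point-cell least-squares gradient
}
```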

MitchellWhiting commented 1 year ago

Thank you for the quick reply! I will give your suggestions a try.