Closed: Alex-jian522 closed this issue 10 months ago
I have tried decreasing my timestep by setting the CFL limit to 0.1, and I also tried fixed timesteps of 1.0e-9 and 1.0e-10 (which I think are small enough), but PeleLMeX reports the same error.
There should be no issues with running autoignition cases in PeleLMeX in general. It would be helpful to see the end of the output that gets printed to stdout and stderr. Based on where the backtrace is failing, I believe you are seeing something that says "deltaT_iters not converged !" printed?
PeleLMeX does a linearized, iterative solve for the nonlinear system in the implicit solve for thermal diffusion. This error appears when that solve does not converge. See Step 2, substep 5 here for more details on the algorithm: https://amrex-combustion.github.io/PeleLMeX/manual/html/Model.html#low-mach-number-projection-scheme
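Conceptually, the deltaT iteration behaves like a Newton-style inversion of h(T): given a target enthalpy, repeatedly update T by the residual divided by cp until the residual drops below a tolerance. The sketch below is a toy illustration, not PeleLMeX's actual implementation; the enthalpy fit, tolerances, and iteration cap are made-up stand-ins:

```python
# Toy Newton-style "deltaT" iteration: find T such that h(T) = h_target.
# h(T) and cp(T) are invented smooth fits, not a real species.
def h(T):
    return T + 5.0e-5 * T * T          # toy enthalpy fit

def cp(T):
    return 1.0 + 1.0e-4 * T            # dh/dT, continuous here

def solve_deltaT(h_target, T, tol=1.0e-10, iter_max=5):
    for k in range(iter_max):
        resid = h_target - h(T)
        print(f"DeltaT solve norm [{k}] = {abs(resid):.8e}")
        if abs(resid) < tol:
            return T, True
        T += resid / cp(T)             # linearized update: deltaT = resid / cp
    return T, False

T, converged = solve_deltaT(h(1100.0), T=900.0)
```

With a smooth, consistent fit, the residual shrinks rapidly and the solve converges well within the iteration cap.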
Usually, when this solve fails it is because of discontinuities in the thermodynamic data, specifically the species enthalpies, in your mechanism. These are specified through NASA polynomials that switch between low- and high-temperature fits at 1000 K, and sometimes the fits do not match smoothly there (the enthalpy or its derivative is discontinuous), which causes the iterative solve to bounce around with a small error rather than converging.
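To see why a mismatch at the crossover can break such an iteration, here is a deliberately broken toy "mechanism" whose low- and high-temperature enthalpy branches disagree by 1e-3 at 1000 K (all numbers are invented for illustration). A target enthalpy that falls inside that gap is unreachable, so the Newton-style update cycles forever with a small residual instead of converging:

```python
T_SWITCH = 1000.0  # NASA-style fits change branches at 1000 K
GAP = 1.0e-3       # artificial enthalpy mismatch between the two fits

def h(T):
    # Toy piecewise-linear enthalpy; the branches do NOT meet at T_SWITCH.
    if T < T_SWITCH:
        return 1.0 * T
    return 1.0 * T_SWITCH + GAP + 1.2 * (T - T_SWITCH)

def cp(T):
    return 1.0 if T < T_SWITCH else 1.2

h_target = 1000.0005          # lands inside the gap: no T gives this h
T, residuals = 999.0, []
for _ in range(20):
    r = h_target - h(T)
    residuals.append(abs(r))
    T += r / cp(T)            # the "deltaT" update

# The residual plateaus near the gap size (~1e-3) and never reaches 1e-10.
```

This is the signature described below: a residual that falls quickly, then bounces around a small plateau set by the size of the inconsistency in the fits.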
The steps for diagnosing this problem are as follows:
1) Run with peleLM.deltaT_verbose = 1 in your input file. This will print diagnostic output relating to the deltaT iterations in the thermal diffusion solve, which should look something like this:
Iterative solve for deltaT
DeltaT solve norm [0] = 1.91905727e-10
DeltaT solve norm [1] = 1.91889072e-10
DeltaT solve norm [2] = 2.302708684e-13
2) If the diagnostic output indicates that the residual is decreasing on each iteration but not reaching the target value of 1e-10, you can increase the number of allowed iterations by setting peleLM.deltaT_iterMax = 10 (or more; the default value is 5).
3) If the diagnostic output shows a residual that decreases, then plateaus and bounces around at a small value above 1e-10, you can increase the tolerance by setting peleLM.deltaT_tol to a value higher than the plateau.
4) You can also allow the simulation to continue when the deltaT iterations do not converge by setting peleLM.deltaT_crashIfFailing = 0. This is a more extreme option, as it may lead to unexpected behavior if some other problem is causing the deltaT solve to fail, so use it with caution.
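Taken together, these knobs might look like the following in an input file (the values shown are illustrative starting points, not recommendations):

```
# Diagnosing deltaT convergence in the thermal diffusion solve
peleLM.deltaT_verbose = 1          # print per-iteration residuals
peleLM.deltaT_iterMax = 10         # allow more iterations (default 5)
peleLM.deltaT_tol = 1.0e-9         # raise above the residual plateau if needed
# peleLM.deltaT_crashIfFailing = 0 # last resort: continue on non-convergence
```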
Thank you very much for your timely reply. It not only solved the problem but also helped me understand it.
I would like to ask whether PeleLMeX can be used to compute auto-ignition cases, and what the possible causes of and solutions to this error are. Thanks for any reply.
Host Name: ibnode74
=== If no file names and line numbers are shown below, one can run addr2line -Cpfie my_exefile my_line_address to convert my_line_address (e.g., 0x4a6b) into a file name and line number, or use amrex/Tools/Backtrace/parse_bt.py. Please note that the line number reported by addr2line may not be accurate; one can use readelf -wl my_exefile | grep my_line_address to find the offset for that line. ===
0: ./PeleLMeX2d.gnu.MPI.ex() [0xf1c1c6] amrex::BLBackTrace::print_backtrace_info(_IO_FILE*) /share/home/zhengjian/Pele/amrex/Src/Base/AMReX_BLBackTrace.cpp:199:36
1: ./PeleLMeX2d.gnu.MPI.ex() [0xf1daa6] amrex::BLBackTrace::handler(int) /share/home/zhengjian/Pele/amrex/Src/Base/AMReX_BLBackTrace.cpp:99:15
2: ./PeleLMeX2d.gnu.MPI.ex() [0x5cc38e] amrex::Abort(char const*) inlined at /share/home/zhengjian/Pele/PeleLMeX/Source/PeleLMDiffusion.cpp:1148:18 in PeleLM::differentialDiffusionUpdate(std::unique_ptr<PeleLM::AdvanceAdvData, std::default_delete >&, std::unique_ptr<PeleLM::AdvanceDiffData, std::default_delete >&)
/share/home/zhengjian/Pele/amrex/Src/Base/AMReX.H:159:19
PeleLM::differentialDiffusionUpdate(std::unique_ptr<PeleLM::AdvanceAdvData, std::default_delete >&, std::unique_ptr<PeleLM::AdvanceDiffData, std::default_delete >&)
/share/home/zhengjian/Pele/PeleLMeX/Source/PeleLMDiffusion.cpp:1148:18
3: ./PeleLMeX2d.gnu.MPI.ex() [0x57be7c] PeleLM::oneSDC(int, std::unique_ptr<PeleLM::AdvanceAdvData, std::default_delete >&, std::unique_ptr<PeleLM::AdvanceDiffData, std::default_delete >&)
/share/home/zhengjian/Pele/PeleLMeX/Source/PeleLMAdvance.cpp:362:4
4: ./PeleLMeX2d.gnu.MPI.ex() [0x57ed07] PeleLM::Advance(int) /share/home/zhengjian/Pele/PeleLMeX/Source/PeleLMAdvance.cpp:172:7
5: ./PeleLMeX2d.gnu.MPI.ex() [0x5939aa] PeleLM::Evolve() /share/home/zhengjian/Pele/PeleLMeX/Source/PeleLMEvolve.cpp:43:18
6: ./PeleLMeX2d.gnu.MPI.ex() [0x429a3f] main /share/home/zhengjian/Pele/PeleLMeX/Source/main.cpp:57:51
7: /lib64/libc.so.6(__libc_start_main+0xf5) [0x2b630057f555]
8: ./PeleLMeX2d.gnu.MPI.ex() [0x4341ad] _start at ??:?