unicfdlab / hybridCentralSolvers

United collection of hybrid Central solvers - one-phase, two-phase and multicomponent versions
GNU General Public License v3.0

Some error when running parallel on OpenFOAM-v2212 #44

Open 16-1895 opened 1 year ago

16-1895 commented 1 year ago

Hello, I ran a high-speed combustion case in parallel successfully with reactingPimpleCentralFoam on OpenFOAM-v1912. Now I need to run it on v2212 for further work, but I get an error: (Error in `reactingPimpleCentralFoam': malloc(): memory corruption: 0x0000000007cece30). When I run the case serially, there is no error. I also tested reactingFoam (v2212) and pimpleCentralFoam (v2212) in parallel, and there is no error either. Do you know how to solve this? Here is my case with the error log: case.zip

16-1895 commented 1 year ago

I also tested the Tutorials/shockTubeTwoGases case (v2212). The serial calculation ran well, but the parallel calculation crashed midway (I only changed numberOfSubdomains): logerror.txt. I am confused now. Can anyone help me? Thanks in advance!

mkraposhin commented 1 year ago

Hi, thank you. I'll check the source code. That seems like strange behaviour; we have checked parallel runs many times.

mkraposhin commented 1 year ago

> I also tested the Tutorials/shockTubeTwoGases case (v2212). The serial calculation ran well, but the parallel calculation crashed midway (I only changed numberOfSubdomains): logerror.txt. I am confused now. Can anyone help me? Thanks in advance!

Looks very similar to a numerical instability

mkraposhin commented 1 year ago

> Hello, I ran a high-speed combustion case in parallel successfully with reactingPimpleCentralFoam on OpenFOAM-v1912. Now I need to run it on v2212 for further work, but I get an error: (Error in `reactingPimpleCentralFoam': malloc(): memory corruption: 0x0000000007cece30). When I run the case serially, there is no error. I also tested reactingFoam (v2212) and pimpleCentralFoam (v2212) in parallel, and there is no error either. Do you know how to solve this? Here is my case with the error log: case.zip

Can you try your case with OpenFOAM-v2112? It looks like this version works OK, and the changes compared to v2212 are not significant. I think the problem is with OpenFOAM itself.

mkraposhin commented 1 year ago

The problem comes from this part of the code (YEqn.H, lines 224-240):

```cpp
forAll(maxDeltaY.boundaryField(), iPatch)
{
    if (maxDeltaY.boundaryField()[iPatch].coupled())
    {
        scalarField intF = maxDeltaY.boundaryField()[iPatch].primitiveField();
        scalarField intH = hLambdaCoeffs.boundaryField()[iPatch];
        const scalarField& intL = lambdaCoeffs.boundaryField()[iPatch].primitiveField();

        forAll(intF, iFace)
        {
            if (intF[iFace] > 0.05)
            {
                intH[iFace] = intL[iFace];
            }
        }

        hLambdaCoeffs.boundaryFieldRef()[iPatch].operator=(intH);
    }
}
```

Try commenting it out and let me know if this helps.
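
For anyone wanting to test this workaround, below is a minimal sketch of one way to disable the block in YEqn.H; it assumes nothing later in the file relies on hLambdaCoeffs being modified by this loop. A preprocessor guard is used here instead of literal comment markers so the block is easy to restore later:

```cpp
// Workaround sketch (not the upstream fix): exclude the coupled-patch
// limiter block from compilation. "#if 0 ... #endif" has the same effect
// as commenting the block out, but keeps the code intact for reference.
#if 0
forAll(maxDeltaY.boundaryField(), iPatch)
{
    if (maxDeltaY.boundaryField()[iPatch].coupled())
    {
        // On coupled patches (e.g. processor boundaries), this block replaces
        // hLambdaCoeffs with lambdaCoeffs on faces where maxDeltaY exceeds 0.05.
        scalarField intF = maxDeltaY.boundaryField()[iPatch].primitiveField();
        scalarField intH = hLambdaCoeffs.boundaryField()[iPatch];
        const scalarField& intL = lambdaCoeffs.boundaryField()[iPatch].primitiveField();

        forAll(intF, iFace)
        {
            if (intF[iFace] > 0.05)
            {
                intH[iFace] = intL[iFace];
            }
        }

        hLambdaCoeffs.boundaryFieldRef()[iPatch].operator=(intH);
    }
}
#endif
```

After editing YEqn.H, the solver needs to be recompiled (e.g. with wmake from the solver's source directory) before re-running the case.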

16-1895 commented 1 year ago

> The problem comes from this part of the code (YEqn.H, lines 224-240): […]
>
> Try commenting it out and let me know if this helps.

Hi, that helps. Both my case and shockTubeTwoGases now run stably in parallel. Is this the final solution?

mkraposhin commented 1 year ago

It looks like something has changed in the handling of inter-processor boundaries. I think you can proceed with the current solution. I'll check over the weekend what exactly has changed and then write here.

16-1895 commented 1 year ago

OK, thanks for your reply.

mkraposhin commented 1 year ago

Hi, I made an amendment; it is available in my repository. I think @unicfdlab will merge it soon. Thank you for reporting the bug!