SimFlowCFD / RapidCFD-dev

RapidCFD is an OpenFOAM fork running fully on the CUDA platform. Brought to you by
https://sim-flow.com

pimpleDyMFoam // LHS and RHS of + have different dimensions #76

Closed Daniel-Molinero closed 4 years ago

Daniel-Molinero commented 4 years ago

Hi, I have tried to run pimpleDyMFoam without success. The case consists of 4 regions (independent meshes), merged and "coupled" with the cyclicAMI noOrdering option (6 patches, 3 interfaces). From inlet to outlet, the first two regions are part of an "MRF_static" zone, the next region is an "MRF_moving" zone, and the last region is again part of the "MRF_static" zone (spiral case, distributor, runner and draft tube). The MRF approach is not used; I just did not want to rename the zones. The total mesh size is almost 23,000,000 cells.

I have not used stitchMesh, since the original meshes only match in overall dimensions at the interface patches and are not conformal to each other; stitching such meshes would create skewed faces even though the cells are hexahedra.
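
For reference, a minimal sketch of what one side of such a cyclicAMI coupling can look like in constant/polyMesh/boundary; the patch names, face counts and offsets below are invented for illustration, not taken from this case:

    AMI_runner_inlet                              // hypothetical patch name
    {
        type            cyclicAMI;                // arbitrary mesh interface coupling
        inGroups        1(cyclicAMI);
        nFaces          12000;                    // placeholder face count
        startFace       22000000;                 // placeholder offset into the face list
        matchTolerance  0.0001;
        transform       noOrdering;               // option used in this case
        neighbourPatch  AMI_distributor_outlet;   // hypothetical partner patch
    }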

The "relevant" set up files are as follows:

dynamicMeshDict.log

fvSolution.log

controlDict.log

The p, U, k and omega files were only modified according to the phenomena being modelled.

The setup is based on https://openfoamwiki.net/index.php/OpenFOAM_guide/The_PIMPLE_algorithm_in_OpenFOAM and https://www.researchgate.net/publication/307546712_Mathematics_Numerics_Derivations_and_OpenFOAMR.

After typing mpirun -np 4 pimpleDyMFoam -parallel -devices "(0 1 2 3)" 2>&1 | tee run.log in the terminal, I get the following error:

run.log

The answers I found after googling "LHS and RHS of + have different dimensions site:www.cfd-online.com" relate to wrong units in the 0/ folder for the initial conditions, but as mentioned above, only the values of the variables were modified.

I tried to run pimpleFoam without the dynamic mesh (i.e. no rotating motion, which is not needed for this check) just to see whether the controlDict and everything else work. It runs without complaining!!

By the way, I tested the AMI interfaces with simpleFoam and the MRF approach. The case has not converged to the desired residuals yet, but it appears to be transferring the motion between regions across the AMI patches.

simpleFoam_MRF.zip

So, I have run out of ideas, and I would like to know whether this is a setup error or something related to the RapidCFD implementation.

Best regards, Daniel Molinero

TonkomoLLC commented 4 years ago

Hello, Daniel,

I've used interDyMFoam with RapidCFD and a moving grid, but not pimpleDyMFoam. I think in principle your case should run, but I am not 100% sure.

Therefore, unless someone writes in with direct knowledge of your problem and the answer to it, you will need to keep working backwards to find the root cause, as you have been doing.

The test with simpleFoam looks good so far. Is this with RapidCFD?

With respect to other possible steps: yes, as you said, the error could be caused by wrong units in the "0" directory. It could also be due to wrong units in the constant/transportProperties file, such as the dimensions declared in that file.
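
For comparison, a typical incompressible constant/transportProperties declares the kinematic viscosity with dimensions of m^2/s; the value below is just an example (roughly water), not taken from your case:

    transportModel  Newtonian;
    nu              nu [ 0 2 -1 0 0 0 0 ] 1e-06;   // [m^2/s]; wrong exponents here produce dimension errors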

The next step I would take is to run your case with OpenFOAM 2.3.x pimpleDyMFoam on the CPU (no GPU). If your case has a problem with something in the 0 directory or the transportProperties file, then the case will almost certainly fail with OpenFOAM 2.3.x as well. However, if the case works with OpenFOAM 2.3.x but fails with RapidCFD's pimpleDyMFoam, then that points to RapidCFD. Except for a few solver choices in fvSolution (e.g., no AINV in OF 2.3.x), RapidCFD cases should be interchangeable with OpenFOAM 2.3.x. If the case fails with OpenFOAM 2.3.x, hopefully you can fix the units and get the case working.
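
Roughly, that check could look like the lines below, assuming a standard OpenFOAM 2.3.x install location and the same four subdomains (the install path and core count are assumptions, not taken from your setup):

    source /opt/OpenFOAM-2.3.x/etc/bashrc    # assumed install path for the CPU build
    decomposePar -force                      # redo the domain decomposition for the CPU run
    mpirun -np 4 pimpleDyMFoam -parallel 2>&1 | tee run_cpu.log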

Hope this helps you out.

Best regards,

Eric

Daniel-Molinero commented 4 years ago

Hi Eric,

Thanks for your fast reply! Answering your question, the test with simpleFoam+MRF was run in RapidCFD.

I have checked the units in the 0 and constant directories. They seem correct to me for an incompressible case:

U.log

p.log

omega.log

nut.log

k.log

transportProperties.log
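
For completeness, these are the dimension sets I am comparing against for an incompressible RANS case (listed from memory; the attached logs are the authoritative values):

    U      [0 1 -1 0 0 0 0]   // m/s
    p      [0 2 -2 0 0 0 0]   // m^2/s^2 (kinematic pressure, p/rho)
    k      [0 2 -2 0 0 0 0]   // m^2/s^2
    omega  [0 0 -1 0 0 0 0]   // 1/s
    nut    [0 2 -1 0 0 0 0]   // m^2/s
    nu     [0 2 -1 0 0 0 0]   // m^2/s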

After you mentioned the solvers (AINV, etc.), I remembered reading somewhere that the GAMG solver does not work well with AMI + dynamic meshes, but I am not sure whether I actually read that or just imagined it. I will run the case again in RapidCFD with PCG/PBiCG + diagonal preconditioning to verify. After all, I am solving p with GAMG and the simulation crashes just after solving U.
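
A rough sketch of the fvSolution change I plan to test (the tolerances below are placeholders, not my actual settings):

    p
    {
        solver          PCG;        // instead of GAMG
        preconditioner  diagonal;
        tolerance       1e-07;
        relTol          0.01;
    }
    "(U|k|omega)"
    {
        solver          PBiCG;
        preconditioner  diagonal;
        tolerance       1e-07;
        relTol          0.1;
    }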

Regarding testing the case in OF 2.3.x, it is a good idea; maybe I will build a smaller, simplified case.

Best regards, Daniel

TonkomoLLC commented 4 years ago

Hi, Daniel,

Yeah, I agree the units look fine (for an incompressible case) in the 0 and constant directories.

On the 2.3.x test: your case fails on the first time step, so you can weigh the trade-off between building a new, smaller case and just running the present 23-million-cell case through one time step.
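
If you go the one-time-step route, a controlDict tweak along these lines is enough; the deltaT value below is a placeholder, so keep your own time step and just set endTime one step past startTime:

    startTime       0;
    stopAt          endTime;
    endTime         1e-04;      // = startTime + one deltaT (placeholder value)
    deltaT          1e-04;      // placeholder; keep the case's own time step
    writeControl    timeStep;
    writeInterval   1;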

Checking PCG+diagonal is a good idea, too.

Good luck --

Cheers,

Eric

Daniel-Molinero commented 4 years ago

Hi,

Just to let you know that I have found the cause of the problem. This is awkward, but it was all the result of a silly (and, I think, common) mistake: I did not update the fvSchemes file in the system directory from steady-state to transient schemes. The file is now corrected and pimpleDyMFoam is running without any warning/issue.

fvSchemes.log
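
To illustrate the kind of change this involved (the exact schemes are in the attached fvSchemes.log; this is just the pattern), the steady-state ddtSchemes entry had to become a transient one:

    ddtSchemes
    {
        // default      steadyState;   // leftover from the steady-state setup
        default         Euler;         // transient scheme needed by pimpleDyMFoam
    }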

I also want to confirm that there is no problem using the GAMG solver + AMI interfaces + dynamic mesh motion + the pimpleDyMFoam application, at least as far as running the case goes.

run.log

Best regards, Daniel Molinero

TonkomoLLC commented 4 years ago

Thanks for writing, and glad you found the problem, Daniel. Good luck with your calculations. Best regards, Eric