Closed joneuhauser closed 2 years ago
I am afraid this is not a nekRS-specific issue.
I think it crashes because no GPU-aware MPI is available. You can disable it (e.g. `export NEKRS_GPU_MPI=0`), but this will introduce a performance regression.
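For reference, a minimal sketch of that workaround, assuming a typical `mpirun` launch of the turbPipe case (the rank count is just an illustration):

```shell
# Disable GPU-aware MPI in nekRS before launching.
# This works around the crash, at the cost of staging halo-exchange
# buffers through host memory instead of passing device pointers to MPI.
export NEKRS_GPU_MPI=0

# Typical parallel launch; the rank count is illustrative.
mpirun -np 4 nekrs --setup turbPipe.par
```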
Thanks for the hint! I got it working with CUDA-enabled MVAPICH 2.3.7. With CUDA-enabled OpenMPI 4.1.4, NekRS doesn't seem to notice that there are multiple ranks - the output looks something like this
but this issue also exists with Nek5000 (a simple MPI hello world reports the correct number of ranks, though, and I haven't had this issue with other numerical codes), which is why I was using MPICH.
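For completeness, a sanity check of the kind mentioned above can look like the following sketch (compile with `mpicc`, launch with `mpirun`; this assumes a standard MPI installation and is not part of nekRS itself):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of ranks */

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
}
```

If `mpirun -np 4 ./hello` prints `of 1` on every line, the launcher and the MPI library the binary was linked against do not match - a common cause of an application seeing only a single rank.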
**Describe the bug**
Running the example program (turbPipePeriodic) with NekRS works only without MPI.
**To Reproduce**
The last output printed to the console is
and attaching GDB tells me
The example works fine if I just run
`nekrs --setup turbPipe.par`.

**Expected behavior**
The example works.
**Version information:**