OpenHPC v2 includes an OpenMPI built against UCX, but it isn't clear whether Intel MPI needs a different UCX.
Intel MPI needs Mellanox UCX v1.4+. Will checked package names, and the UCX from the system and Mellanox repos appears to be the same, so maybe install that, at least on v1 systems?
For Intel MPI we should install slurm-libpmi-ohpc, which allows `I_MPI_PMI_LIBRARY=/lib64/libpmi.so`. See the Slurm MPI page.
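A minimal sketch of what that enables, assuming an Intel MPI program launched directly with srun (the resource flags and `./hello_mpi` binary are illustrative placeholders):

```sh
#!/bin/bash
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=4

# Point Intel MPI at Slurm's PMI library (provided by slurm-libpmi-ohpc)
export I_MPI_PMI_LIBRARY=/lib64/libpmi.so

# Launch with srun instead of mpirun; Slurm handles process placement
srun ./hello_mpi
```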
Install the MPI performance metapackage by default so we get a testable cluster (i.e., compilers, MPI, IMB).
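For example, a smoke test along these lines should then work out of the box (gnu9/openmpi4 are the OpenHPC v2 defaults; exact module names may differ):

```sh
# Load the default OpenHPC v2 toolchain plus the Intel MPI Benchmarks
module load gnu9 openmpi4 imb

# Two-rank ping-pong across nodes as a basic fabric sanity check
srun --nodes=2 --ntasks=2 IMB-MPI1 PingPong
```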
When we've done this we should also consider setting a default MPI plugin in the Slurm config, or maybe even modifying the modulefiles??
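Concretely, that could be something like the following in slurm.conf (MpiDefault is the relevant Slurm parameter; pmi2 is just one plausible choice, not settled here):

```
# slurm.conf: pick a default MPI plugin so srun works without --mpi=...
MpiDefault=pmi2
```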
Things to consider adding:
- slurm-libpmi-ohpc
See: https://slurm.schedmd.com/mpi_guide.html