Closed eschnett closed 2 years ago
MPItrampoline is close to OpenMPI, so this could work.
I found a bug in MPItrampoline (MPI_WIN_* constants were not wrapped) and a few issues in Score-P. I'm correcting MPItrampoline as we speak, and I'll attach a patch for Score-P that should make it compile later.
This patch makes Score-P build with MPItrampoline. Score-P uses MPI constants as case expressions in C, which is not allowed by the MPI standard (e.g. MPI standard 3.1, section 2.5.4). In another place it uses an MPI constant to initialize a global variable; this happens to work in C++ but not in C, so I have added a work-around for that as well.
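For context: the MPI standard only guarantees that handle constants are fixed by link time, so a portable C program cannot use them as case labels, which must be integer constant expressions. A minimal sketch of the required rewrite, using hypothetical stand-in variables rather than the real MPI headers:

```c
/* Stand-ins for MPI handle constants. With MPItrampoline these are
 * ordinary objects resolved at link/load time, not constant expressions. */
int fake_MPI_COMM_WORLD = 1;
int fake_MPI_COMM_SELF  = 2;

/* Not valid C when the labels are not integer constant expressions:
 *
 *   switch (comm) {
 *     case fake_MPI_COMM_WORLD: ...
 *   }
 *
 * Portable rewrite as an if/else chain: */
const char *comm_name(int comm) {
    if (comm == fake_MPI_COMM_WORLD) return "world";
    if (comm == fake_MPI_COMM_SELF)  return "self";
    return "unknown";
}
```

The same comparison chain works whether the implementation defines the constants as compile-time literals or as link-time symbols.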
@eschnett Yes, that's now building for me. I ran the jacobi tests for C/C++ and they seem to run without issue (also when using MPIwrapper to override the default MPI).
@eschnett It looks like mpi4py is another one with similar issues to Score-P:
building 'mpi4py.MPI' extension
/project/def-sponsor00/easybuild/software/MPItrampoline/3.5.1-GCC-10.3.0/bin/mpicc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -O2 -ftree-vectorize -march=native -fno-math-errno -fPIC -O2 -ftree-vectorize -march=native -fno-math-errno -fPIC -O1 -march=native -fno-math-errno -fPIC -I/project/def-sponsor00/easybuild/software/FFTW/3.3.9-gmpit-2021a/include -I/cvmfs/pilot.eessi-hpc.org/versions/2021.12/software/linux/x86_64/amd/zen2/software/FlexiBLAS/3.0.4-GCC-10.3.0/include -I/cvmfs/pilot.eessi-hpc.org/versions/2021.12/software/linux/x86_64/amd/zen2/software/FlexiBLAS/3.0.4-GCC-10.3.0/include/flexiblas -fPIC -DHAVE_CONFIG_H=1 -DHAVE_DLFCN_H=1 -DHAVE_DLOPEN=1 -I/cvmfs/pilot.eessi-hpc.org/versions/2021.12/software/linux/x86_64/amd/zen2/software/Python/3.9.5-GCCcore-10.3.0/include/python3.9 -c src/MPI.c -o build/temp.linux-x86_64-3.9/src/MPI.o
In file included from src/atimport.h:17,
from src/mpi4py.MPI.c:597,
from src/MPI.c:4:
src/pympicommctx.h:26:35: error: initializer element is not constant
26 | static int PyMPI_Commctx_KEYVAL = MPI_KEYVAL_INVALID;
| ^~~~~~~~~~~~~~~~~~
mpi4py has special code for each MPI implementation it supports. This patch for mpi4py enables building with MPItrampoline for me.
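The "initializer element is not constant" error above is the same class of problem: C requires static initializers to be constant expressions, and MPItrampoline's MPI constants are not. A hedged sketch of the usual work-around (hypothetical names, not mpi4py's actual code): leave the variable zero-initialized and assign the sentinel at run time instead.

```c
/* Stand-in for MPI_KEYVAL_INVALID: an ordinary object, so in C it
 * cannot appear in a static initializer. */
int fake_MPI_KEYVAL_INVALID = -7;

/* Rejected by a C compiler ("initializer element is not constant"):
 *
 *   static int commctx_keyval = fake_MPI_KEYVAL_INVALID;
 *
 * Work-around: zero-initialize and set it once during startup. */
static int commctx_keyval;
static int commctx_keyval_ready = 0;

void commctx_init(void) {
    if (!commctx_keyval_ready) {
        commctx_keyval = fake_MPI_KEYVAL_INVALID;
        commctx_keyval_ready = 1;
    }
}

int commctx_get_keyval(void) {
    commctx_init();
    return commctx_keyval;
}
```

C++ accepts the original form because it allows dynamic initialization of globals, which is why the same code builds as C++ but not as C.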
I also needed to add several accidentally omitted MPI constants to MPItrampoline.
mpi4py built fine with the patch, thanks. I found another issue with CP2K (a Fortran code), but that is separate from this PR, so I will open an issue.
@ocaisa I'm trying to integrate the patch provided by @eschnett into Score-P. While the switch labels are straightforward, I don't see the __constructor__ approach as the way to go for Score-P. This patch for Score-P 7.1 takes a different approach. Would it be possible for you to test whether it works? Thanks
@cfeld Thank you for looking into this. While I haven't tested the patch, it looks correct. (I notice that it also turns id_root_type into a private variable.)
id_root_type is made private as it is used in a single compilation unit only. Instead of initializing it via __constructor__, I do the initialization early in the Score-P initialization process. I have no idea how this interferes with MPItrampoline, though.
The new patch compiles successfully, I also did a quick runtime check (jacobi C++) and that also seems to work.
@cfeld The problem is that MPItrampoline initializes the handles at link time (once it knows which MPI library to actually use), thus initialization needs to be delayed. Your choice seems optimal for this.
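To illustrate the ordering hazard (a sketch with hypothetical names, not Score-P's code): a client-side __attribute__((constructor)) runs at an unspecified point relative to other startup code, possibly before the MPI wrapper has filled in its handles, whereas an explicit call from the application's own initialization path runs strictly after.

```c
/* Stand-in for a handle that the MPI wrapper fills in at load time. */
static int wrapper_handle = 0;   /* 0 = not yet resolved */

static int cached_handle = 0;

/* Risky: ordering relative to the wrapper's own startup is unspecified,
 * so this may capture the unresolved value:
 *
 *   __attribute__((constructor))
 *   static void cache_handle_ctor(void) { cached_handle = wrapper_handle; }
 *
 * Safe alternative: call this explicitly early in the application's
 * initialization, once the wrapper is known to be ready. */
void cache_handle(void) { cached_handle = wrapper_handle; }

/* Simulates the wrapper resolving its handles at load/link time. */
void wrapper_resolve(void) { wrapper_handle = 42; }

int get_cached_handle(void) { return cached_handle; }
```

Deferring the assignment into the normal initialization sequence, as the Score-P patch does, makes the ordering explicit instead of relying on constructor ordering.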
OK, thanks for the explanation. We will patch Score-P accordingly. Fix will be available in Score-P 8.0.
I've tested this to compile Score-P and I currently get the following compilation error:
Score-P seems to explicitly want to know the MPI library, so I'm not sure what impact that might have (I faked it by telling it it was Open MPI; that seemed to do the trick, and the configure results look OK to me).