ec-jrc / pyPoseidon

Framework for Hydrodynamic simulations
https://pyposeidon.readthedocs.io/
European Union Public License 1.2

Make sure tools.create_mpirun_script() works both with mpich & openmpi #41

Closed. brey closed this issue 3 years ago

brey commented 3 years ago

Setting use_threads: True by default works with openmpi but not mpich.

We need to either identify which flavour of MPI is installed, or do a get_value() within schism.py so that the user can specify use_threads: False when using mpich.
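A minimal sketch of the flavour-detection option, assuming we shell out to `mpirun --version` and key off the first line of its output (the helper name `get_mpi_flavour` is hypothetical, not an existing function in tools.py):

```python
import shutil
import subprocess


def get_mpi_flavour() -> str:
    """Return "openmpi", "mpich" or "unknown" based on `mpirun --version`."""
    mpirun = shutil.which("mpirun")
    if mpirun is None:
        return "unknown"
    result = subprocess.run([mpirun, "--version"], capture_output=True, text=True)
    first_line = result.stdout.splitlines()[0] if result.stdout else ""
    # openmpi reports e.g. "mpirun (Open MPI) 4.0.5", while mpich's
    # hydra launcher starts with "HYDRA build details:".
    if "Open MPI" in first_line:
        return "openmpi"
    if "HYDRA" in first_line or "MPICH" in first_line:
        return "mpich"
    return "unknown"
```

create_mpirun_script() could then derive its default for use_threads from the detected flavour instead of hard-coding True.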

pmav99 commented 3 years ago

Could you please paste the output of the following command on Mac:

mpirun --version 

If it's not too much trouble, both for mpich and openmpi.

pmav99 commented 3 years ago

For reference, on linux:

$ /opt/mpich/bin/mpirun --version
HYDRA build details:
    Version:                                 3.4.2
    Release Date:                            Wed May 26 15:51:40 CDT 2021
    CC:                              gcc    -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions         -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security         -fstack-clash-protection -fcf-protection -Wno-error=array-bounds  
    Configure options:                       '--disable-option-checking' '--prefix=/opt/mpich' '--with-device=ch4:ucx' '--with-hwloc-prefix=system' '--without-java' '--enable-error-checking=runtime' '--enable-error-messages=all' '--enable-g=meminit' 'CC=gcc' 'CXX=g++' 'FC=gfortran' 'FFLAGS=-fallow-argument-mismatch -O2' 'FCFLAGS=-fallow-argument-mismatch -O2' 'MPICHLIB_CFLAGS=-march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wno-error=array-bounds' 'MPICHLIB_CPPFLAGS=' 'MPICHLIB_CXXFLAGS=-march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wp,-D_GLIBCXX_ASSERTIONS' 'MPICHLIB_FFLAGS=' 'MPICHLIB_FCFLAGS=' '--cache-file=/dev/null' '--srcdir=../../../../src/pm/hydra' 'CFLAGS= -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wno-error=array-bounds -O2' 'LDFLAGS=' 'LIBS=' 'CPPFLAGS= -DNETMOD_INLINE=__netmod_inline_ucx__ -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/build/src/mpl/include -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/src/mpl/include -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/build/modules/yaksa/src/frontend/include -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/modules/yaksa/src/frontend/include -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/modules/json-c -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/build/modules/json-c -D_REENTRANT -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/build/src/mpi/romio/include -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/build/modules/ucx/src -I/home/aur_builder/.cache/yay/mpich/src/mpich-3.4.2/modules/ucx/src' 'MPLLIBNAME=mpl'
    Process Manager:                         pmi
    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
    Topology libraries available:            hwloc
    Resource management kernels available:   user slurm ll lsf sge pbs cobalt
    Demux engines available:                 poll select

and

$ mpirun --version
mpirun (Open MPI) 4.1.1

Report bugs to http://www.open-mpi.org/community/help/

brey commented 3 years ago

OPENMPI:

❯ conda list | grep mpi
hdf5                      1.10.6          mpi_openmpi_h44f39ad_1011    conda-forge
libnetcdf                 4.7.4           mpi_openmpi_h9bf59f8_7    conda-forge
mpi                       1.0                     openmpi    conda-forge
netcdf-fortran            4.5.3           mpi_openmpi_he825b94_1    conda-forge
netcdf4                   1.5.6           nompi_py39h353b61e_102    conda-forge
openmpi                   4.0.5                h809c96e_4    conda-forge
pschism                   5.8.0           mpi_openmpi_h732f192_0    gbrey
❯ mpirun --version
mpirun (Open MPI) 4.0.5

MPICH:

❯ conda list | grep mpi
# packages in environment at /Users/brey/miniconda3/envs/ptests_mpich:
hdf5                      1.10.6          mpi_mpich_hd7fb7e8_1011    conda-forge
libnetcdf                 4.7.4           mpi_mpich_h2865646_7    conda-forge
mpi                       1.0                       mpich    conda-forge
mpich                     3.3.2                hd33e60e_5    conda-forge
netcdf-fortran            4.5.3           mpi_mpich_h7f55214_1    conda-forge
netcdf4                   1.5.6           nompi_py38h2c97785_102    conda-forge
pschism                   5.8.0           mpi_mpich_h6037171_0    gbrey
❯ mpirun --version
HYDRA build details:
    Version:                                 3.3.2
    Release Date:                            Tue Nov 12 21:23:16 CST 2019
    CC:                              x86_64-apple-darwin13.4.0-clang -I/Users/brey/miniconda3/envs/ptests_mpich/include -I/Users/brey/miniconda3/envs/ptests_mpich/include -L/Users/brey/miniconda3/envs/ptests_mpich/lib -Wl,-rpath,/Users/brey/miniconda3/envs/ptests_mpich/lib 
    CXX:                             x86_64-apple-darwin13.4.0-clang++ -I/Users/brey/miniconda3/envs/ptests_mpich/include -I/Users/brey/miniconda3/envs/ptests_mpich/include -L/Users/brey/miniconda3/envs/ptests_mpich/lib -Wl,-rpath,/Users/brey/miniconda3/envs/ptests_mpich/lib 
    F77:                             x86_64-apple-darwin13.4.0-gfortran -I/Users/brey/miniconda3/envs/ptests_mpich/include -L/Users/brey/miniconda3/envs/ptests_mpich/lib -Wl,-rpath,/Users/brey/miniconda3/envs/ptests_mpich/lib 
    F90:                             x86_64-apple-darwin13.4.0-gfortran -I/Users/brey/miniconda3/envs/ptests_mpich/include -L/Users/brey/miniconda3/envs/ptests_mpich/lib -Wl,-rpath,/Users/brey/miniconda3/envs/ptests_mpich/lib 
    Configure options:                       '--disable-option-checking' '--prefix=/Users/brey/miniconda3/envs/ptests_mpich' '--disable-dependency-tracking' '--enable-cxx' '--enable-fortran' '--disable-wrapper-rpath' 'build_alias=x86_64-apple-darwin13.4.0' 'host_alias=x86_64-apple-darwin13.4.0' 'MPICHLIB_CFLAGS=-march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix' 'MPICHLIB_CPPFLAGS=-D_FORTIFY_SOURCE=2 -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -mmacosx-version-min=10.9' 'MPICHLIB_CXXFLAGS=-march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -stdlib=libc++ -fvisibility-inlines-hidden -std=c++14 -fmessage-length=0 -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix' 'MPICHLIB_FFLAGS=-march=core2 -mtune=haswell -ftree-vectorize -fPIC -fstack-protector -O2 -pipe -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix' 'MPICHLIB_FCFLAGS=-march=core2 -mtune=haswell -ftree-vectorize -fPIC -fstack-protector -O2 -pipe -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix' 'CC=x86_64-apple-darwin13.4.0-clang' 'CFLAGS=-I/Users/brey/miniconda3/envs/ptests_mpich/include -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix -O2' 'LDFLAGS=-L/Users/brey/miniconda3/envs/ptests_mpich/lib -Wl,-rpath,/Users/brey/miniconda3/envs/ptests_mpich/lib' 'CPPFLAGS=-I/Users/brey/miniconda3/envs/ptests_mpich/include -D_FORTIFY_SOURCE=2 -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -mmacosx-version-min=10.9 -I/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work/src/mpl/include -I/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work/src/mpl/include -I/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work/src/openpa/src -I/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work/src/openpa/src -D_REENTRANT -I/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work/src/mpi/romio/include' 'CXX=x86_64-apple-darwin13.4.0-clang++' 'CXXFLAGS=-I/Users/brey/miniconda3/envs/ptests_mpich/include -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -stdlib=libc++ -fvisibility-inlines-hidden -std=c++14 -fmessage-length=0 -isystem /Users/brey/miniconda3/envs/ptests_mpich/include 
-fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix -O2' 'FC=x86_64-apple-darwin13.4.0-gfortran' 'FCFLAGS=-I/Users/brey/miniconda3/envs/ptests_mpich/include -march=core2 -mtune=haswell -ftree-vectorize -fPIC -fstack-protector -O2 -pipe -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix -O2' 'FFLAGS=-I/Users/brey/miniconda3/envs/ptests_mpich/include -march=core2 -mtune=haswell -ftree-vectorize -fPIC -fstack-protector -O2 -pipe -isystem /Users/brey/miniconda3/envs/ptests_mpich/include -fdebug-prefix-map=/Users/runner/miniforge3/conda-bld/mpich-mpi_1605915212374/work=/usr/local/src/conda/mpich-3.3.2 -fdebug-prefix-map=/Users/brey/miniconda3/envs/ptests_mpich=/usr/local/src/conda-prefix -O2' '--cache-file=/dev/null' '--srcdir=.' 'LIBS=' 'MPLLIBNAME=mpl'
    Process Manager:                         pmi
    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
    Topology libraries available:            hwloc
    Resource management kernels available:   user slurm ll lsf sge pbs cobalt
    Checkpointing libraries available:       
    Demux engines available:                 poll select
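Based on these outputs, the first line alone seems enough to tell the two apart, so a default along these lines (names hypothetical, continuing the sketch above) could replace the hard-coded setting:

```python
# Assumption: the thread-related mpirun option is only accepted by openmpi,
# so only enable it by default when that flavour is detected.
use_threads_default = get_mpi_flavour() == "openmpi"
```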

pmav99 commented 3 years ago

Are there any other MPI implementations that we need to support?

brey commented 3 years ago

SCHISM is also working on incorporating OpenMP. I am not sure how that will affect things in the future. OpenMP is also used by gmsh, but it is not yet switched on in the conda build (I mean to open an issue upstream with the maintainers of gmsh-feedstock).