pghysels / STRUMPACK

Structured Matrix Package (LBNL)
http://portal.nersc.gov/project/sparse/strumpack/

Crash with updated PETSc interface using version 7.1.3 #109

Closed s-sajid-ali closed 1 year ago

s-sajid-ali commented 1 year ago

Using the updated STRUMPACK interface in PETSc-3.20.0 (with the GPU solve capabilities) results in a crash with the following trace:

Propagator: starting turn 1, final turn 1

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: No support for this operation on this system
[0]PETSC ERROR: SLATE requires MPI_THREAD_MULTIPLE
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.20.0, Sep 28, 2023 
[0]PETSC ERROR: ./booster_fd on a  named wcgpu03.fnal.gov by sasyed Wed Oct 11 20:05:59 2023
[0]PETSC ERROR: Configure options --prefix=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h --with-ssl=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0 --with-make-exec=make CFLAGS=-O3 COPTFLAGS= FFLAGS=-O3 FOPTFLAGS= CXXFLAGS=-O3 CXXOPTFLAGS= --with-cc=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/openmpi-4.1.5-dtavl3prebht2mfzsx7kipn7aasgnaql/bin/mpicc --with-cxx=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/openmpi-4.1.5-dtavl3prebht2mfzsx7kipn7aasgnaql/bin/mpic++ --with-fc=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/openmpi-4.1.5-dtavl3prebht2mfzsx7kipn7aasgnaql/bin/mpif90 --with-precision=double --with-scalar-type=real --with-shared-libraries=1 --with-debugging=0 --with-openmp=0 --with-64-bit-indices=0 --with-blaslapack-lib=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/openblas-0.3.24-jye46jc22kvh6libzwermjws7mm5l6vt/lib/libopenblas.so --with-batch=1 --with-x=0 --with-clanguage=C --with-cuda=1 --with-cuda-dir=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/cuda-12.2.1-asmcxn2vp6whqwygr4jpsh4evklv3zve --with-hip=0 --with-metis=1 --with-metis-include=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/metis-5.1.0-jpueqfdfrqt6abqjvpityabqjbssdczi/include --with-metis-lib=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/metis-5.1.0-jpueqfdfrqt6abqjvpityabqjbssdczi/lib/libmetis.so --with-hypre=1 --with-hypre-include=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/hypre-2.29.0-yp5i4caa3gk25cazv4itvi2m63k32rv5/include 
--with-hypre-lib=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/hypre-2.29.0-yp5i4caa3gk25cazv4itvi2m63k32rv5/lib/libHYPRE.so --with-parmetis=1 --with-parmetis-include=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/parmetis-4.0.3-3pxngte3pxm2kvh4eeucnjj3bxosfbwp/include --with-parmetis-lib=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/parmetis-4.0.3-3pxngte3pxm2kvh4eeucnjj3bxosfbwp/lib/libparmetis.so --with-kokkos=0 --with-kokkos-kernels=0 --with-superlu_dist=1 --with-superlu_dist-include=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/superlu-dist-8.1.2-usj7s2kc6d4kmigu3bljye5gjud6vxt7/include --with-superlu_dist-lib=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/superlu-dist-8.1.2-usj7s2kc6d4kmigu3bljye5gjud6vxt7/lib/libsuperlu_dist.so --with-ptscotch=0 --with-suitesparse=0 --with-hdf5=1 --with-hdf5-include=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/hdf5-1.14.2-afz27n65es4synnsemdew6hjxkhbh2d7/include --with-hdf5-lib="/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/hdf5-1.14.2-afz27n65es4synnsemdew6hjxkhbh2d7/lib/libhdf5_hl.so /wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/hdf5-1.14.2-afz27n65es4synnsemdew6hjxkhbh2d7/lib/libhdf5.so" --with-zlib=0 --with-mumps=0 --with-trilinos=0 --with-fftw=0 --with-valgrind=0 --with-gmp=0 --with-libpng=0 --with-giflib=0 --with-mpfr=0 --with-netcdf=0 --with-pnetcdf=0 --with-moab=0 --with-random123=0 --with-exodusii=0 --with-cgns=0 --with-memkind=0 --with-p4est=0 --with-saws=0 --with-yaml=0 --with-hwloc=0 --with-libjpeg=0 --with-scalapack=1 
--with-scalapack-lib=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/netlib-scalapack-2.2.0-xot4ujctnuvzgyb5rdjua67ab63agqyl/lib/libscalapack.so --with-strumpack=1 --with-strumpack-include=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/strumpack-7.1.3-i7dsreqo3p7jstuxowejz3jizjnnz4wu/include --with-strumpack-lib=/wclustre/accelsim/spack-shared-v5/spack/opt/spack/linux-scientific7-cascadelake/gcc-12.2.0/strumpack-7.1.3-i7dsreqo3p7jstuxowejz3jizjnnz4wu/lib64/libstrumpack.so --with-mmg=0 --with-parmmg=0 --with-tetgen=0 --with-cuda-arch=70
[0]PETSC ERROR: #1 MatGetFactor_aij_strumpack() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/mat/impls/aij/mpi/strumpack/strumpack.c:1153
[0]PETSC ERROR: #2 MatGetFactor() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/mat/interface/matrix.c:4783
[0]PETSC ERROR: #3 PCFactorSetUpMatSolverType_Factor() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/ksp/pc/impls/factor/factimpl.c:9
[0]PETSC ERROR: #4 PCFactorSetUpMatSolverType() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/ksp/pc/impls/factor/factor.c:105
[0]PETSC ERROR: #5 init_solver() at /wclustre/accelsim/sajid/packages/synergia2/src/synergia/collective/space_charge_3d_fd_utils.cc:313
[0]PETSC ERROR: #6 init_solver_sc3d_fd() at /wclustre/accelsim/sajid/packages/synergia2/src/synergia/collective/space_charge_3d_fd.cc:468
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR:   It appears a new error in the code was triggered after a previous error, possibly because:
[0]PETSC ERROR:   -  The first error was not properly handled via (for example) the use of
[0]PETSC ERROR:      PetscCall(TheFunctionThatErrors()); or
[0]PETSC ERROR:   -  The second error was triggered while handling the first error.
[0]PETSC ERROR:   Above is the traceback for the previous unhandled error, below the traceback for the next error
[0]PETSC ERROR:   ALL ERRORS in the PETSc libraries are fatal, you should add the appropriate error checking to the code
[0]PETSC ERROR: No support for this operation on this system
[0]PETSC ERROR: SLATE requires MPI_THREAD_MULTIPLE
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.20.0, Sep 28, 2023 
[0]PETSC ERROR: ./booster_fd on a  named wcgpu03.fnal.gov by sasyed Wed Oct 11 20:05:59 2023
[0]PETSC ERROR: Configure options (same as above)
[0]PETSC ERROR: #1 MatGetFactor_aij_strumpack() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/mat/impls/aij/mpi/strumpack/strumpack.c:1153
[0]PETSC ERROR: #2 MatGetFactor() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/mat/interface/matrix.c:4783
[0]PETSC ERROR: #3 PCFactorSetUpMatSolverType_Factor() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/ksp/pc/impls/factor/factimpl.c:9
[0]PETSC ERROR: #4 PCFactorSetUpMatSolverType() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/ksp/pc/impls/factor/factor.c:105
[0]PETSC ERROR: #5 PCSetUp_LU() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/ksp/pc/impls/factor/lu/lu.c:80
[0]PETSC ERROR: #6 PCSetUp() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/ksp/pc/interface/precon.c:1068
[0]PETSC ERROR: #7 KSPSetUp() at /tmp/sasyed/spack-stage/spack-stage-petsc-3.20.0-autl7wxw7ic2zkl7jq4s2mi225qtnu4h/spack-src/src/ksp/ksp/interface/itfunc.c:415
[0]PETSC ERROR: #8 compute_mat() at /wclustre/accelsim/sajid/packages/synergia2/src/synergia/collective/space_charge_3d_fd_utils.cc:420
[0]PETSC ERROR: #9 apply_impl() at /wclustre/accelsim/sajid/packages/synergia2/src/synergia/collective/space_charge_3d_fd.cc:140
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 11 DUP FROM 4
with errorcode 57.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
-bash-4.2$ 

The MPI library used is OpenMPI@4.1.5, which has MPI_THREAD_MULTIPLE support; the output of ompi_info is attached: ompi_info.txt. Let me know if I can provide further context to help debug the issue.

Here is the full spec for the PETSc+STRUMPACK build:

-bash-4.2$ spack find -ldv petsc
-- linux-scientific7-cascadelake / gcc@12.2.0 -------------------
autl7wx petsc@3.20.0~X+batch~cgns~complex+cuda~debug+double~exodusii~fftw+fortran~giflib+hdf5~hpddm~hwloc+hypre~int64~jpeg~knl~kokkos~libpng~libyaml~memkind+metis~mkl-pardiso~mmg~moab~mpfr+mpi~mumps~openmp~p4est~parmmg~ptscotch~random123~rocm~saws~scalapack+shared+strumpack~suite-sparse+superlu-dist~tetgen~trilinos~valgrind build_system=generic clanguage=C cuda_arch=70 memalign=none
asmcxn2     cuda@12.2.1~allow-unsupported-compilers~dev build_system=generic
bxsrrsb         libxml2@2.10.3+pic~python+shared build_system=autotools
mrr5lel     diffutils@3.9 build_system=autotools
pra36hz         libiconv@1.17 build_system=autotools libs=shared,static
afz27n6     hdf5@1.14.2~cxx~fortran+hl~ipo~java~map+mpi+shared~szip~threadsafe+tools api=default build_system=cmake build_type=Release generator=make
kkhue7l         cmake@3.27.6~doc+ncurses+ownlibs build_system=generic build_type=Release
lecuo7k             curl@8.1.2~gssapi~ldap~libidn2~librtmp~libssh~libssh2+nghttp2 build_system=autotools libs=shared,static tls=openssl
ekyvux6                 nghttp2@1.52.0 build_system=autotools
2pewpvy         gmake@4.4.1~guile build_system=autotools
c7xlycj         pkgconf@1.9.5 build_system=autotools
yp5i4ca     hypre@2.29.0~caliper~complex~cuda~debug+fortran~gptune~int64~internal-superlu~magma~mixedint+mpi~openmp~rocm+shared~superlu-dist~sycl~umpire~unified-memory build_system=autotools
jpueqfd     metis@5.1.0~gdb~int64~ipo~real64+shared build_system=cmake build_type=Release generator=make patches=4991da9,93a7903,b1225da
xot4ujc     netlib-scalapack@2.2.0~ipo~pic+shared build_system=cmake build_type=Release generator=make patches=072b006,1c9ce5f,244a9aa
jye46jc     openblas@0.3.24~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile symbol_suffix=none threads=openmp
nxsciii         perl@5.38.0+cpanm+opcode+open+shared+threads build_system=generic patches=714e4d1
ohh7me5             berkeley-db@18.1.40+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc
dtavl3p     openmpi@4.1.5~atomics+cuda~cxx~cxx_exceptions~gpfs+internal-hwloc~internal-pmix~java+legacylaunchers~lustre~memchecker~openshmem~orterunprefix~pmi+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools cuda_arch=70 fabrics=ucx schedulers=slurm
g75dvtn         numactl@2.0.14 build_system=autotools patches=4e1d78c,62fc8a8,ff37630
k3qosik             autoconf@2.69 build_system=autotools patches=7793209
y5p2bp7             automake@1.16.5 build_system=autotools
uuxioem             libtool@2.4.7 build_system=autotools
xolk7pu             m4@1.4.19+sigsegv build_system=autotools patches=9dc5fbd,bfdffa7
uwbt5n3                 libsigsegv@2.14 build_system=autotools
brrz6u3         openssh@7.4p1+gssapi build_system=autotools
uersbae         pmix@3.2.3~docs+pmi_backwards_compatibility build_system=autotools
qwhjtc2             hwloc@2.9.1~cairo~cuda~gl~libudev+libxml2~netloc~nvml~oneapi-level-zero~opencl+pci~rocm build_system=autotools libs=shared,static
bklgapw                 libpciaccess@0.17 build_system=autotools
ygtbb3c                     util-macros@1.19.3 build_system=autotools
zd2xq4n             libevent@2.1.12+openssl build_system=autotools
7ul4oik         slurm@21.08.8~gtk~hdf5~hwloc~mariadb~pmix+readline~restd build_system=autotools sysconfdir=PREFIX/etc
thyonny         ucx@1.14.1~assertions~backtrace_detail~cma+cuda~dc~debug~dm+examples+gdrcopy~gtest~ib_hw_tm~java~knem~logging~mlx5_dv+numa+openmp+optimizations~parameter_checking+pic~rc~rdmacm~rocm+thread_multiple~ucg~ud~verbs~vfs~xpmem build_system=autotools cuda_arch=70 libs=shared,static opt=3 simd=auto
okoex6q             gdrcopy@2.3+cuda build_system=makefile cuda_arch=none patches=c5efec1
kavlqf4                 check@0.15.2 build_system=autotools
3pxngte     parmetis@4.0.3~gdb~int64~ipo+shared build_system=cmake build_type=Release generator=make patches=4f89253,50ed208,704b84f
y4rq447     python@3.11.6+bz2+crypt+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tkinter+uuid+zlib build_system=generic patches=13fa8bf,b0615b2,ebdca64,f2fd060
ootiije         bzip2@1.0.8~debug~pic+shared build_system=generic
p23cxsw         expat@2.5.0+libbsd build_system=autotools
ep6dr6b             libbsd@0.11.7 build_system=autotools
uje4q2w                 libmd@1.0.4 build_system=autotools
ndo3nex         gdbm@1.23 build_system=autotools
mp7bgol         gettext@0.21.1+bzip2+curses+git~libunistring+libxml2+pic+shared+tar+xz build_system=autotools
6u2mptm             tar@1.34 build_system=autotools zip=pigz
n6amyeh                 pigz@2.7 build_system=makefile
bzi57fk                 zstd@1.5.5+programs build_system=makefile compression=none libs=shared,static
mz7rmqi         libffi@3.4.4 build_system=autotools
ycarvo2         libxcrypt@4.4.35~obsolete_api build_system=autotools patches=4885da3
2mtzf22             perl@5.16.3~cpanm+opcode+open+shared+threads build_system=generic patches=0eac10e,3bbd7d6
7uz3g43         ncurses@6.4~symlinks+termlib abi=none build_system=autotools
mklnkjv         openssl@3.1.3~docs+shared build_system=generic certs=mozilla
p7wbwjf             ca-certificates-mozilla@2023-05-30 build_system=generic
z6jybvz         readline@8.2 build_system=autotools patches=bbf97f1
hhvjtuk         sqlite@3.42.0+column_metadata+dynamic_extensions+fts~functions+rtree build_system=autotools
6dtejgq         util-linux-uuid@2.38.1 build_system=autotools
2acbuus         xz@5.4.1~pic build_system=autotools libs=shared,static
i7dsreq     strumpack@7.1.3+butterflypack+c_interface~count_flops+cuda~ipo+magma+mpi+openmp+parmetis~rocm~scotch+shared+slate~task_timers+zfp build_system=cmake build_type=Release cuda_arch=70 generator=make
ogypblg         butterflypack@2.2.2~ipo+openmp+shared build_system=cmake build_type=Release generator=make
go6pv6n             arpack-ng@3.9.0~icb~ipo+mpi+shared build_system=cmake build_type=Release generator=make
2jagnv6             sed@4.9 build_system=autotools
nhcjm3v         magma@2.7.2+cuda+fortran~ipo~rocm+shared build_system=cmake build_type=Release cuda_arch=70 generator=make
mxew33s         slate@2023.08.25+cuda~ipo+mpi+openmp~rocm+shared~sycl build_system=cmake build_type=Release cuda_arch=70 generator=make
ioeb5ex             blaspp@2023.08.25+cuda~ipo+openmp~rocm+shared~sycl build_system=cmake build_type=Release cuda_arch=70 generator=make
fyy5jnd             lapackpp@2023.08.25+cuda~ipo~rocm+shared~sycl build_system=cmake build_type=Release cuda_arch=70 generator=make
nuftelp         zfp@0.5.5~aligned~c~cuda~fasthash~fortran~ipo~openmp~profile~python+shared~strided~twoway+utilities bsws=64 build_system=cmake build_type=Release generator=make
usj7s2k     superlu-dist@8.1.2~cuda~int64~ipo~openmp~rocm+shared build_system=cmake build_type=Release generator=make
n2shs5h     zlib-ng@2.1.3+compat+opt build_system=autotools patches=299b958,ae9077a,b692621

==> 1 installed package
-bash-4.2$ 
pghysels commented 1 year ago

Was MPI initialized with MPI_Init_thread, requesting MPI_THREAD_MULTIPLE? I think the easiest way to do that is shown on line 50 here: ex52

s-sajid-ali commented 1 year ago

Thanks for the explanation!