GlobalArrays / ga

Partitioned Global Address Space (PGAS) library for distributed arrays
http://hpc.pnl.gov/globalarrays/

Build fails with Intel MKL #308

Closed samfux84 closed 5 months ago

samfux84 commented 1 year ago

Hi,

I am trying to build GA as part of OpenMolcas on a Linux cluster and would like to use the Intel MKL (ilp64) libraries for BLAS/LAPACK.

But the build always fails with:

-- Could NOT find BLAS (missing: BLAS_LINK_OK ilp64)
CMake Error at cmake/ga-linalg.cmake:159 (message):
  ENABLE_BLAS=ON, but a LAPACK library was not found
Call Stack (most recent call first):
  CMakeLists.txt:154 (include)

The CMake command is:

cd /scratch/260965700.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build && /cluster/apps/gcc-8.2.0/cmake-3.25.0-45vsyctxfmoqynitpwn6qiaqrlfn5mvr/bin/cmake -D CMAKE_INSTALL_PREFIX=/scratch/260965700.tmpdir/OpenMolcas/build/External/GlobalArrays_install -D CMAKE_INSTALL_RPATH_USE_LINK_PATH=TRUE -D BUILD_SHARED_LIBS=ON -D ENABLE_FORTRAN=ON -D ENABLE_CXX=OFF -D ENABLE_TESTS=OFF -D ENABLE_BLAS=ON -D LINALG_VENDOR=IntelMKL -D LINALG_PREFIX=/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2 -D LINALG_THREAD_LAYER=sequential -D LINALG_REQUIRED_COMPONENTS=ilp64 -D "BLAS_LIBRARIES=/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_scalapack_ilp64.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_gf_ilp64.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_core.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_gnu_thread.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_blacs_openmpi_ilp64.so" -D ENABLE_I8=ON -D LINALG_REQUIRED_COMPONENTS=ilp64 "-GUnix Makefiles" /scratch/260965700.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays

Is this a known problem, and is there a fix for it?

Best regards

Sam

ajaypanyala commented 1 year ago

Hi @samfux84, can you try with only the following:

cd /scratch/260965700.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build && /cluster/apps/gcc-8.2.0/cmake-3.25.0-45vsyctxfmoqynitpwn6qiaqrlfn5mvr/bin/cmake -D CMAKE_INSTALL_PREFIX=/scratch/260965700.tmpdir/OpenMolcas/build/External/GlobalArrays_install -D CMAKE_INSTALL_RPATH_USE_LINK_PATH=TRUE -D BUILD_SHARED_LIBS=ON -D ENABLE_FORTRAN=ON -D ENABLE_CXX=OFF -D ENABLE_TESTS=OFF -D ENABLE_BLAS=ON -D LINALG_VENDOR=IntelMKL -D LINALG_PREFIX=/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2 -D LINALG_THREAD_LAYER=sequential -D LINALG_REQUIRED_COMPONENTS=ilp64

samfux84 commented 1 year ago

Hi @ajaypanyala, thank you for your reply.

I have tried with

cd /scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build && /cluster/apps/gcc-8.2.0/cmake-3.25.0-45vsyctxfmoqynitpwn6qiaqrlfn5mvr/bin/cmake -D CMAKE_INSTALL_PREFIX=/scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays_install -D CMAKE_INSTALL_RPATH_USE_LINK_PATH=TRUE -D BUILD_SHARED_LIBS=ON -D ENABLE_FORTRAN=ON -D ENABLE_CXX=OFF -D ENABLE_TESTS=OFF -D ENABLE_BLAS=ON -D LINALG_VENDOR=IntelMKL -D LINALG_PREFIX=/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2 -D LINALG_THREAD_LAYER=sequential -D LINALG_REQUIRED_COMPONENTS=ilp64 -D LINALG_PREFIX=/scratch/260997378.tmpdir/OpenMolcas/build -D ENABLE_I8=ON -D LINALG_REQUIRED_COMPONENTS=ilp64 "-GUnix Makefiles" /scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays

omitting the BLAS_LIBRARIES variable, but I still get the same outcome:

[  1%] Performing configure step for 'GlobalArrays'
cd /scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build && /cluster/apps/gcc-8.2.0/cmake-3.25.0-45vsyctxfmoqynitpwn6qiaqrlfn5mvr/bin/cmake -D CMAKE_INSTALL_PREFIX=/scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays_install -D CMAKE_INSTALL_RPATH_USE_LINK_PATH=TRUE -D BUILD_SHARED_LIBS=ON -D ENABLE_FORTRAN=ON -D ENABLE_CXX=OFF -D ENABLE_TESTS=OFF -D ENABLE_BLAS=ON -D LINALG_VENDOR=IntelMKL -D LINALG_PREFIX=/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2 -D LINALG_THREAD_LAYER=sequential -D LINALG_REQUIRED_COMPONENTS=ilp64 -D LINALG_PREFIX=/scratch/260997378.tmpdir/OpenMolcas/build -D ENABLE_I8=ON -D LINALG_REQUIRED_COMPONENTS=ilp64 "-GUnix Makefiles" /scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays
-- The C compiler identification is GNU 8.2.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /cluster/spack/apps/linux-centos7-x86_64/gcc-4.8.5/gcc-8.2.0-6xqov2fhvbmehix42slain67vprec3fs/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- CMAKE_MODULE_PATH: /scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays/cmake/linalg-modules;/scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays/cmake
-- Value of ENABLE_CXX was set by user to : OFF
-- Value of ENABLE_FORTRAN was set by user to : ON
-- Setting value of CMAKE_CXX_EXTENSIONS to default : OFF
-- The Fortran compiler identification is GNU 8.2.0
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /cluster/spack/apps/linux-centos7-x86_64/gcc-4.8.5/gcc-8.2.0-6xqov2fhvbmehix42slain67vprec3fs/bin/gfortran - skipped
-- Setting value of CMAKE_BUILD_TYPE to default : Release
-- Value of LINALG_VENDOR was set by user to : IntelMKL
-- Value of ENABLE_TESTS was set by user to : OFF
-- Setting value of ENABLE_PROFILING to default : OFF
-- Value of ENABLE_BLAS was set by user to : ON
-- Setting value of ENABLE_SCALAPACK to default : OFF
-- Setting value of ENABLE_EISPACK to default : OFF
-- Setting value of ENABLE_DPCPP to default : OFF
-- Setting value of GA_RUNTIME to default : MPI_2SIDED
-- Checking MPI ...
-- Found MPI_C: /cluster/apps/gcc-8.2.0/openmpi-4.1.4-zsgryzeyf2z5uqn4ptv3j4ectbwicehy/lib/libmpi.so (found version "3.1")
-- Found MPI_Fortran: /cluster/apps/gcc-8.2.0/openmpi-4.1.4-zsgryzeyf2z5uqn4ptv3j4ectbwicehy/lib/libmpi_usempif08.so (found version "3.1")
-- Found MPI: TRUE (found version "3.1")
-- Performing Test HAVE_RESTRICT
-- Performing Test HAVE_RESTRICT - Success
-- Performing Test HAVE_INLINE_NATIVE
-- Performing Test HAVE_INLINE_NATIVE - Success
-- Performing Test HAVE_PAUSE
-- Performing Test HAVE_PAUSE - Success
-- Performing Test HAVE_LONG_DOUBLE
-- Performing Test HAVE_LONG_DOUBLE - Success
-- Performing Test HAVE_SYS_WEAK_ALIAS_PRAGMA
-- Performing Test HAVE_SYS_WEAK_ALIAS_PRAGMA - Success
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of int
-- Check size of int - done
-- Check size of double
-- Check size of double - done
-- Check size of float
-- Check size of float - done
-- Check size of long
-- Check size of long - done
-- Check size of long double
-- Check size of long double - done
-- Check size of long long
-- Check size of long long - done
-- Check size of short
-- Check size of short - done
-- Looking for include file assert.h
-- Looking for include file assert.h - found
-- Looking for include file limits.h
-- Looking for include file limits.h - found
-- Looking for include file linux/limits.h
-- Looking for include file linux/limits.h - found
-- Looking for include file malloc.h
-- Looking for include file malloc.h - found
-- Looking for include file math.h
-- Looking for include file math.h - found
-- Looking for include file stdio.h
-- Looking for include file stdio.h - found
-- Looking for include file stdlib.h
-- Looking for include file stdlib.h - found
-- Looking for include file strings.h
-- Looking for include file strings.h - found
-- Looking for include file string.h
-- Looking for include file string.h - found
-- Looking for include file unistd.h
-- Looking for include file unistd.h - found
-- Looking for include file windows.h
-- Looking for include file windows.h - not found
-- Looking for bzero
-- Looking for bzero - found
-- Detecting Fortran/C Interface
-- Detecting Fortran/C Interface - Found GLOBAL and MODULE mangling
-- void size: 8, USE_I8: ON, ENABLE_I8: ON
-- BLAS_LIBRARIES Not Given: Will Perform Search
-- Could NOT find IntelMKL (missing: IntelMKL_LIBRARIES IntelMKL_INCLUDE_DIR ilp64 ilp64)
-- Performing Test BLAS_LOWER_UNDERSCORE
-- Performing Test BLAS_LOWER_UNDERSCORE -- not found
-- Performing Test BLAS_LOWER_NO_UNDERSCORE
-- Performing Test BLAS_LOWER_NO_UNDERSCORE -- not found
-- Performing Test BLAS_UPPER_UNDERSCORE
-- Performing Test BLAS_UPPER_UNDERSCORE -- not found
-- Performing Test BLAS_UPPER_NO_UNDERSCORE
-- Performing Test BLAS_UPPER_NO_UNDERSCORE -- not found
-- Could NOT find BLAS (missing: BLAS_LINK_OK ilp64)
CMake Error at cmake/ga-linalg.cmake:159 (message):
  ENABLE_BLAS=ON, but a LAPACK library was not found
Call Stack (most recent call first):
  CMakeLists.txt:154 (include)

-- Configuring incomplete, errors occurred!
See also "/scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeOutput.log".
See also "/scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeError.log".
make[2]: *** [CMakeFiles/GlobalArrays.dir/build.make:99: External/GlobalArrays/src/GlobalArrays-stamp/GlobalArrays-configure] Error 1
make[2]: Leaving directory '/scratch/260997378.tmpdir/OpenMolcas/build'
make[1]: *** [CMakeFiles/Makefile2:2537: CMakeFiles/GlobalArrays.dir/all] Error 2

The MKL installation is present in the path that I specified as LINALG_PREFIX:


[sfux@eu-c7-102-01 build]$ ls /cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2
benchmarks  bin  documentation  env  examples  include  interfaces  lib  licensing  modulefiles  tools
[sfux@eu-c7-102-01 build]$ ls /cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/bin
ia32  intel64
[sfux@eu-c7-102-01 build]$ ls /cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib
cmake  ia32  intel64  pkgconfig
[sfux@eu-c7-102-01 build]$ ls /cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/
libmkl_avx2.so.2                  libmkl_blacs_openmpi_ilp64.so    libmkl_core.a         libmkl_gnu_thread.a      libmkl_intel_thread.so    libmkl_rt.so.2               libmkl_sycl.a           libmkl_vml_def.so.2
libmkl_avx512.so.2                libmkl_blacs_openmpi_ilp64.so.2  libmkl_core.so        libmkl_gnu_thread.so     libmkl_intel_thread.so.2  libmkl_scalapack_ilp64.a     libmkl_sycl.so          libmkl_vml_mc2.so.2
libmkl_avx.so.2                   libmkl_blacs_openmpi_lp64.a      libmkl_core.so.2      libmkl_gnu_thread.so.2   libmkl_lapack95_ilp64.a   libmkl_scalapack_ilp64.so    libmkl_sycl.so.2        libmkl_vml_mc3.so.2
libmkl_blacs_intelmpi_ilp64.a     libmkl_blacs_openmpi_lp64.so     libmkl_def.so.2       libmkl_intel_ilp64.a     libmkl_lapack95_lp64.a    libmkl_scalapack_ilp64.so.2  libmkl_tbb_thread.a     libmkl_vml_mc.so.2
libmkl_blacs_intelmpi_ilp64.so    libmkl_blacs_openmpi_lp64.so.2   libmkl_gf_ilp64.a     libmkl_intel_ilp64.so    libmkl_mc3.so.2           libmkl_scalapack_lp64.a      libmkl_tbb_thread.so    locale
libmkl_blacs_intelmpi_ilp64.so.2  libmkl_blas95_ilp64.a            libmkl_gf_ilp64.so    libmkl_intel_ilp64.so.2  libmkl_mc.so.2            libmkl_scalapack_lp64.so     libmkl_tbb_thread.so.2
libmkl_blacs_intelmpi_lp64.a      libmkl_blas95_lp64.a             libmkl_gf_ilp64.so.2  libmkl_intel_lp64.a      libmkl_pgi_thread.a       libmkl_scalapack_lp64.so.2   libmkl_vml_avx2.so.2
libmkl_blacs_intelmpi_lp64.so     libmkl_cdft_core.a               libmkl_gf_lp64.a      libmkl_intel_lp64.so     libmkl_pgi_thread.so      libmkl_sequential.a          libmkl_vml_avx512.so.2
libmkl_blacs_intelmpi_lp64.so.2   libmkl_cdft_core.so              libmkl_gf_lp64.so     libmkl_intel_lp64.so.2   libmkl_pgi_thread.so.2    libmkl_sequential.so         libmkl_vml_avx.so.2
libmkl_blacs_openmpi_ilp64.a      libmkl_cdft_core.so.2            libmkl_gf_lp64.so.2   libmkl_intel_thread.a    libmkl_rt.so              libmkl_sequential.so.2       libmkl_vml_cmpt.so.2
[sfux@eu-c7-102-01 build]$
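For context, a sequential ILP64 MKL link line combines three of the libraries listed above: an ilp64 interface library, a threading layer, and the core library. A hedged sketch of the combination that GA's detection eventually settles on later in this thread (`MKLROOT` here is just a local shell variable standing in for the prefix shown above):

```shell
# Hedged sketch: the three-library sequential ILP64 combination
# (interface + threading layer + core) plus the usual system libs.
# MKLROOT is a local variable, not an exported environment assumption.
MKLROOT=/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2
MKL_ILP64_LIBS="-Wl,--no-as-needed \
${MKLROOT}/lib/intel64/libmkl_intel_ilp64.so \
${MKLROOT}/lib/intel64/libmkl_sequential.so \
${MKLROOT}/lib/intel64/libmkl_core.so \
-lm -ldl -lpthread"
echo "$MKL_ILP64_LIBS"
```

Note that the original `BLAS_LIBRARIES` value mixed the `gf_ilp64` interface with the `gnu_thread` layer plus ScaLAPACK/BLACS while requesting `LINALG_THREAD_LAYER=sequential`, which may be one reason the link check failed.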
ajaypanyala commented 1 year ago

Could you please post the file /scratch/260997378.tmpdir/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeError.log?

samfux84 commented 1 year ago

Sorry for the late reply. I was away for a bit more than a week and could only reproduce the issue this week.


[sfux@eu-g1-018-1 CMakeFiles]$ pwd
/scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles
[sfux@eu-g1-018-1 CMakeFiles]$ cat CMakeError.log
Determining if files windows.h exist failed with the following output:
Change Dir: /scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeScratch/TryCompile-8aXvu1

Run Build Command(s):/cluster/apps/sfos/bin/gmake -f Makefile cmTC_af659/fast && gmake[3]: Entering directory '/scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeScratch/TryCompile-8aXvu1'
/cluster/apps/sfos/bin/gmake  -f CMakeFiles/cmTC_af659.dir/build.make CMakeFiles/cmTC_af659.dir/build
gmake[4]: Entering directory '/scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeScratch/TryCompile-8aXvu1'
Building C object CMakeFiles/cmTC_af659.dir/HAVE_WINDOWS_H.c.o
/cluster/spack/apps/linux-centos7-x86_64/gcc-4.8.5/gcc-8.2.0-6xqov2fhvbmehix42slain67vprec3fs/bin/gcc   -fPIE -o CMakeFiles/cmTC_af659.dir/HAVE_WINDOWS_H.c.o -c /scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeScratch/TryCompile-8aXvu1/HAVE_WINDOWS_H.c
/scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeScratch/TryCompile-8aXvu1/HAVE_WINDOWS_H.c:2:10: fatal error: windows.h: No such file or directory
 #include <windows.h>
          ^~~~~~~~~~~
compilation terminated.
gmake[4]: *** [CMakeFiles/cmTC_af659.dir/build.make:78: CMakeFiles/cmTC_af659.dir/HAVE_WINDOWS_H.c.o] Error 1
gmake[4]: Leaving directory '/scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeScratch/TryCompile-8aXvu1'
gmake[3]: *** [Makefile:127: cmTC_af659/fast] Error 2
gmake[3]: Leaving directory '/scratch/tmp.21278477.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeScratch/TryCompile-8aXvu1'

Source:
/* */
#include <windows.h>

int main(void){return 0;}

[sfux@eu-g1-018-1 CMakeFiles]$
jeffhammond commented 1 year ago

If you are building on Linux, the test for windows.h is expected to fail.

samfux84 commented 1 year ago

@jeffhammond I was asked by @ajaypanyala to provide the content of this file.

It still does not explain why the error

CMake Error at cmake/ga-linalg.cmake:159 (message):
  ENABLE_BLAS=ON, but a LAPACK library was not found

happens with MKL.

ajaypanyala commented 1 year ago

@samfux84 The CMake error log does not seem to have any useful information. Could you please try checking out the branch issue308 (I added some prints to help debug) and paste the output of your GA CMake command?

samfux84 commented 1 year ago

@ajaypanyala Thank you for your reply. I have to stop for today, but I will continue investigating the issue tomorrow, test the issue308 branch, and provide feedback.

samfux84 commented 1 year ago

@ajaypanyala I cannot easily change the commit of GA that OpenMolcas checks out (I am not sure where to change this in the OpenMolcas CMake files; I tried, but did not succeed), so I had to manually repeat the steps that OpenMolcas performs and make sure the correct branch "issue308" gets checked out. With that branch, the LAPACK library of MKL is actually found when I omit the -D BLAS_LIBRARIES variable:

[sfux@eu-g1-018-1 GlobalArrays-build]$ cmake -D BUILD_SHARED_LIBS=ON -D ENABLE_FORTRAN=ON -D ENABLE_CXX=OFF -D ENABLE_TESTS=OFF -D ENABLE_BLAS=ON -D LINALG_VENDOR=IntelMKL -D LINALG_PREFIX=/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2 -D LINALG_THREAD_LAYER=sequential -D LINALG_REQUIRED_COMPONENTS=ilp64 ../GlobalArrays
-- The C compiler identification is GNU 8.2.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /cluster/spack/apps/linux-centos7-x86_64/gcc-4.8.5/gcc-8.2.0-6xqov2fhvbmehix42slain67vprec3fs/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- CMAKE_MODULE_PATH: /scratch/tmp.21408580.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays/cmake/linalg-modules;/scratch/tmp.21408580.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays/cmake
-- Value of ENABLE_CXX was set by user to : OFF
-- Value of ENABLE_FORTRAN was set by user to : ON
-- Setting value of CMAKE_CXX_EXTENSIONS to default : OFF
-- The Fortran compiler identification is GNU 8.2.0
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /cluster/spack/apps/linux-centos7-x86_64/gcc-4.8.5/gcc-8.2.0-6xqov2fhvbmehix42slain67vprec3fs/bin/gfortran - skipped
-- Setting value of CMAKE_BUILD_TYPE to default : Release
-- Value of LINALG_VENDOR was set by user to : IntelMKL
-- Value of ENABLE_TESTS was set by user to : OFF
-- Setting value of ENABLE_PROFILING to default : OFF
-- Setting value of ENABLE_SYSV to default : OFF
-- Value of ENABLE_BLAS was set by user to : ON
-- Setting value of ENABLE_SCALAPACK to default : OFF
-- Setting value of ENABLE_EISPACK to default : OFF
-- Setting value of ENABLE_DPCPP to default : OFF
-- Setting value of ENABLE_DEV_MODE to default : OFF
-- Check GNU compiler versions.
-- Setting value of GA_RUNTIME to default : MPI_2SIDED
-- Checking MPI ...
-- Found MPI_C: /cluster/apps/gcc-8.2.0/openmpi-4.1.4-zsgryzeyf2z5uqn4ptv3j4ectbwicehy/lib/libmpi.so (found version "3.1")
-- Found MPI_Fortran: /cluster/apps/gcc-8.2.0/openmpi-4.1.4-zsgryzeyf2z5uqn4ptv3j4ectbwicehy/lib/libmpi_usempif08.so (found version "3.1")
-- Found MPI: TRUE (found version "3.1")
-- Performing Test HAVE_RESTRICT
-- Performing Test HAVE_RESTRICT - Success
-- Performing Test HAVE_INLINE_NATIVE
-- Performing Test HAVE_INLINE_NATIVE - Success
-- Performing Test HAVE_PAUSE
-- Performing Test HAVE_PAUSE - Success
-- Performing Test HAVE_LONG_DOUBLE
-- Performing Test HAVE_LONG_DOUBLE - Success
-- Performing Test HAVE_SYS_WEAK_ALIAS_PRAGMA
-- Performing Test HAVE_SYS_WEAK_ALIAS_PRAGMA - Success
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of int
-- Check size of int - done
-- Check size of double
-- Check size of double - done
-- Check size of float
-- Check size of float - done
-- Check size of long
-- Check size of long - done
-- Check size of long double
-- Check size of long double - done
-- Check size of long long
-- Check size of long long - done
-- Check size of short
-- Check size of short - done
-- Looking for include file assert.h
-- Looking for include file assert.h - found
-- Looking for include file limits.h
-- Looking for include file limits.h - found
-- Looking for include file linux/limits.h
-- Looking for include file linux/limits.h - found
-- Looking for include file malloc.h
-- Looking for include file malloc.h - found
-- Looking for include file math.h
-- Looking for include file math.h - found
-- Looking for include file stdio.h
-- Looking for include file stdio.h - found
-- Looking for include file stdlib.h
-- Looking for include file stdlib.h - found
-- Looking for include file strings.h
-- Looking for include file strings.h - found
-- Looking for include file string.h
-- Looking for include file string.h - found
-- Looking for include file unistd.h
-- Looking for include file unistd.h - found
-- Looking for include file windows.h
-- Looking for include file windows.h - not found
-- Looking for bzero
-- Looking for bzero - found
-- Detecting Fortran/C Interface
-- Detecting Fortran/C Interface - Found GLOBAL and MODULE mangling
-- void size: 8, USE_I8: ON, ENABLE_I8: ON
-- BLAS_LIBRARIES Not Given: Will Perform Search
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE
-- Found IntelMKL: -Wl,--no-as-needed;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_intel_ilp64.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_sequential.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_core.so;m;dl;Threads::Threads (found version "2022.0.0") found components: ilp64 ilp64
-- Performing Test BLAS_LOWER_UNDERSCORE
-- Performing Test BLAS_LOWER_UNDERSCORE -- found
-- Found BLAS: TRUE  found components: ilp64
-- LAPACK_LIBRARIES Not Given: Checking for LAPACK in BLAS
-- Performing Test LAPACK_LOWER_UNDERSCORE
-- Performing Test LAPACK_LOWER_UNDERSCORE -- found
-- BLAS Has A Full LAPACK Linker
-- Found LAPACK: TRUE
-- Performing Test BLAS_LOWER_UNDERSCORE
-- Performing Test BLAS_LOWER_UNDERSCORE -- found
-- Found BLAS: TRUE
-- HAVE_BLAS: 1
-- HAVE_LAPACK: 1
-- HAVE_SCALAPACK:
-- BLAS_LIBRARIES: -Wl,--no-as-needed;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_intel_ilp64.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_sequential.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_core.so;m;dl;Threads::Threads
-- LAPACK_LIBRARIES: -Wl,--no-as-needed;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_intel_ilp64.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_sequential.so;/cluster/apps/nss/intel/oneapi/2022.1.2/mkl/2022.0.2/lib/intel64/libmkl_core.so;m;dl;Threads::Threads
-- CMAKE_Fortran_COMPILER: /cluster/spack/apps/linux-centos7-x86_64/gcc-4.8.5/gcc-8.2.0-6xqov2fhvbmehix42slain67vprec3fs/bin/gfortran
-- CMAKE_Fortran_COMPILER_ID: GNU
-- Using GNU Fortran compiler settings
-- Fortran flags:  -fdefault-integer-8
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Performing Test BLAS_1_SIGNATURE
-- Performing Test BLAS_1_SIGNATURE - Success
-- Performing Test BLAS_2_SIGNATURE
-- Performing Test BLAS_2_SIGNATURE - Success
-- Performing Test BLAS_3_SIGNATURE
-- Performing Test BLAS_3_SIGNATURE - Failed
-- Performing Test BLAS_4_SIGNATURE
-- Performing Test BLAS_4_SIGNATURE - Success
-- Performing Test BLAS_5_SIGNATURE
-- Performing Test BLAS_5_SIGNATURE - Failed
-- Performing Test BLAS_6_SIGNATURE
-- Performing Test BLAS_6_SIGNATURE - Failed
-- Performing Test FUNCTION_1_SIGNATURE
-- Performing Test FUNCTION_1_SIGNATURE - Success
-- Performing Test FUNCTION_2_SIGNATURE
-- Performing Test FUNCTION_2_SIGNATURE - Success
-- Looking for sched_setaffinity
-- Looking for sched_setaffinity - found
-- Looking for pthread_setaffinity_np
-- Looking for pthread_setaffinity_np - found
-- Configuring done
-- Generating done
-- Build files have been written to: /scratch/tmp.21408580.sfux/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build
ajaypanyala commented 1 year ago

The issue308 branch only adds a print for debugging and is based on the latest code. It looks like OpenMolcas is using a GA commit from March 2021, as shown here. There have been several significant changes to the CMake build of GA since then. I would recommend updating the GA_HASH to a more recent commit, 3cb97b6.

samfux84 commented 1 year ago

@ajaypanyala Thank you for the update. I was already assuming that OpenMolcas uses an older commit of GA. I will try to change the OpenMolcas routines to use 3cb97b6 instead of the commit currently used. This won't be trivial: I already changed the commit in the respective CMake files (when testing the issue308 branch), but the build then still fails, because OpenMolcas tries to apply a patch to some GA CMake files, and that patch does not apply when I use a different branch.

I will try to hack the OpenMolcas code to make it possible to use a different branch.

Thank you very much for the good support.

Best regards

Sam

ajaypanyala commented 5 months ago

Closing due to inactivity. Please feel free to re-open if needed.