Closed usefulhyun closed 7 years ago
Why do you want to use GCC? The default on macOS is Clang. I've just tried on my machine and it seems to work. Please give the precise error message.
The default gcc on macOS just calls Clang, which is why I mentioned GCC; it does not support OpenMP.
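As context for this claim: on macOS, `/usr/bin/gcc` is a shim for Apple Clang, which at the time shipped without OpenMP support. A minimal sketch for probing whether whatever `cc` resolves to accepts `-fopenmp` (the probe file name and messages are arbitrary, not part of any build system):

```shell
# Probe whether the default C compiler accepts -fopenmp by compiling
# a trivial program; prints exactly one line either way.
dir=$(mktemp -d)
printf 'int main(void) { return 0; }\n' > "$dir/probe.c"
if cc -fopenmp "$dir/probe.c" -o "$dir/probe" 2>/dev/null; then
  echo "cc accepts -fopenmp"
else
  echo "cc does not accept -fopenmp"
fi
rm -rf "$dir"
```

On a stock macOS of that era this prints the "does not accept" line; with GNU GCC or a Clang built with libomp it prints the "accepts" line.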
The error message is below.
LoadError: failed process: Process(`cmake -D CMAKE_INSTALL_PREFIX=/Users/usefulhyun/.julia/v0.5/Elemental/deps/usr -D INSTALL_PYTHON_PACKAGE=OFF -D PYTHON_EXECUTABLE= -D PYTHON_SITE_PACKAGES= -D EL_USE_64BIT_INTS=ON -D EL_USE_64BIT_BLAS_INTS=ON -D MATH_LIBS=/Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/libopenblas64_.dylib -D EL_BLAS_SUFFIX=_64_ -D EL_LAPACK_SUFFIX=_64_ -D CMAKE_INSTALL_RPATH=/Users/usefulhyun/.julia/v0.5/Elemental/deps/usr/lib /Users/usefulhyun/.julia/v0.5/Elemental/deps/src/Elemental`, ProcessExited(1)) [1]
while loading /Users/usefulhyun/.julia/v0.5/Elemental/deps/build.jl, in expression starting on line 56
Do you have CMake installed?
Yes, CMake is installed. I tried the same installation procedure on Ubuntu 14.04 and it works there. My cmake version is 3.7.2.
Is there more output from the failed build?
Here is the entire build log.
INFO: Building Elemental
-- The C compiler identification is AppleClang 8.0.0.8000042
-- The CXX compiler identification is AppleClang 8.0.0.8000042
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Could NOT find PythonInterp (missing: PYTHON_EXECUTABLE)
-- Appending /Users/usefulhyun/.julia/v0.5/Elemental/deps/src/Elemental/include for Elemental's source includes
-- Appending /Users/usefulhyun/.julia/v0.5/Elemental/deps/builds/include for Elemental's binary includes
CMake Warning at CMakeLists.txt:250 (message):
  Build mode not specified, defaulting to Release build.
-- The Fortran compiler identification is GNU 6.3.0
-- Checking whether Fortran compiler has -isysroot
-- Checking whether Fortran compiler has -isysroot - yes
-- Checking whether Fortran compiler supports OSX deployment target flag
-- Checking whether Fortran compiler supports OSX deployment target flag - yes
-- Check for working Fortran compiler: /usr/local/bin/gfortran
-- Check for working Fortran compiler: /usr/local/bin/gfortran -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /usr/local/bin/gfortran supports Fortran 90
-- Checking whether /usr/local/bin/gfortran supports Fortran 90 -- yes
-- Detecting Fortran/C Interface
-- Detecting Fortran/C Interface - Found GLOBAL and MODULE mangling
-- Verifying Fortran/CXX Compiler Compatibility
-- Verifying Fortran/CXX Compiler Compatibility - Success
-- Performing Test _HAS_CXX11_FLAG
-- Performing Test _HAS_CXX11_FLAG - Success
-- Checking C++ support for "auto"
-- Checking C++ support for "auto": works
-- Checking C++ support for "class_override_final"
-- Checking C++ support for "class_override_final": works
-- Checking C++ support for "constexpr"
-- Checking C++ support for "constexpr": works
-- Checking C++ support for "cstdint_header"
-- Checking C++ support for "cstdint_header": works
-- Checking C++ support for "decltype"
-- Checking C++ support for "decltype": works
-- Checking C++ support for "defaulted_functions"
-- Checking C++ support for "defaulted_functions": works
-- Checking C++ support for "delegating_constructors"
-- Checking C++ support for "delegating_constructors": works
-- Checking C++ support for "deleted_functions"
-- Checking C++ support for "deleted_functions": works
-- Checking C++ support for "func_identifier"
-- Checking C++ support for "func_identifier": works
-- Checking C++ support for "initializer_list"
-- Checking C++ support for "initializer_list": works
-- Checking C++ support for "lambda"
-- Checking C++ support for "lambda": works
-- Checking C++ support for "long_long"
-- Checking C++ support for "long_long": works
-- Checking C++ support for "nullptr"
-- Checking C++ support for "nullptr": works
-- Checking C++ support for "rvalue_references"
-- Checking C++ support for "rvalue_references": works
-- Checking C++ support for "sizeof_member"
-- Checking C++ support for "sizeof_member": works
-- Checking C++ support for "static_assert"
-- Checking C++ support for "static_assert": works
-- Checking C++ support for "variadic_templates"
-- Checking C++ support for "variadic_templates": works
-- Found CXXFeatures: TRUE
-- CXX11_COMPILER_FLAGS=-std=c++11
-- Performing Test EL_HAVE_TEMPLATE_ALIAS
-- Performing Test EL_HAVE_TEMPLATE_ALIAS - Success
-- Performing Test EL_HAVE_STEADYCLOCK
-- Performing Test EL_HAVE_STEADYCLOCK - Success
-- Performing Test EL_HAVE_NOEXCEPT
-- Performing Test EL_HAVE_NOEXCEPT - Success
-- Performing Test EL_HAVE_NORMAL_DIST
-- Performing Test EL_HAVE_NORMAL_DIST - Success
-- Performing Test EL_HAVE_UNIFORM_INT_DIST
-- Performing Test EL_HAVE_UNIFORM_INT_DIST - Success
-- Performing Test EL_HAVE_UNIFORM_REAL_DIST
-- Performing Test EL_HAVE_UNIFORM_REAL_DIST - Success
-- Performing Test HAVE___restrict__
-- Performing Test HAVE___restrict__ - Success
-- Performing Test HAVE___restrict
-- Performing Test HAVE___restrict - Success
-- Performing Test HAVE_restrict
-- Performing Test HAVE_restrict - Failed
-- Using __restrict__ keyword.
-- Found MPI_C: /usr/local/Cellar/mpich/3.2_2/lib/libmpi.dylib;/usr/local/Cellar/mpich/3.2_2/lib/libpmpi.dylib
-- Found MPI_CXX: /usr/local/Cellar/mpich/3.2_2/lib/libmpicxx.dylib;/usr/local/Cellar/mpich/3.2_2/lib/libmpi.dylib;/usr/local/Cellar/mpich/3.2_2/lib/libpmpi.dylib
-- Found MPI_Fortran: /usr/local/Cellar/mpich/3.2_2/lib/libmpifort.dylib;/usr/local/Cellar/mpich/3.2_2/lib/libmpi.dylib;/usr/local/Cellar/mpich/3.2_2/lib/libpmpi.dylib
-- Performing Test EL_HAVE_MPI_REDUCE_SCATTER
-- Performing Test EL_HAVE_MPI_REDUCE_SCATTER - Success
-- Will parse MPI header /usr/local/Cellar/mpich/3.2_2/include/mpi.h
-- Performing Test EL_HAVE_MPI_TYPE_CREATE_STRUCT
-- Performing Test EL_HAVE_MPI_TYPE_CREATE_STRUCT - Success
-- Performing Test EL_HAVE_MPI_LONG_LONG
-- Performing Test EL_HAVE_MPI_LONG_LONG - Success
-- Performing Test EL_HAVE_MPI_LONG_DOUBLE
-- Performing Test EL_HAVE_MPI_LONG_DOUBLE - Success
-- Performing Test EL_HAVE_MPI_LONG_DOUBLE_COMPLEX
-- Performing Test EL_HAVE_MPI_LONG_DOUBLE_COMPLEX - Failed
-- Performing Test EL_HAVE_MPI_C_COMPLEX
-- Performing Test EL_HAVE_MPI_C_COMPLEX - Success
-- Performing Test EL_HAVE_MPI_REDUCE_SCATTER_BLOCK
-- Performing Test EL_HAVE_MPI_REDUCE_SCATTER_BLOCK - Success
-- Performing Test EL_HAVE_MPI3_NONBLOCKING_COLLECTIVES
-- Performing Test EL_HAVE_MPI3_NONBLOCKING_COLLECTIVES - Success
-- Performing Test EL_HAVE_MPIX_NONBLOCKING_COLLECTIVES
-- Performing Test EL_HAVE_MPIX_NONBLOCKING_COLLECTIVES - Failed
-- Performing Test EL_HAVE_MPI_INIT_THREAD
-- Performing Test EL_HAVE_MPI_INIT_THREAD - Success
-- Performing Test EL_HAVE_MPI_QUERY_THREAD
-- Performing Test EL_HAVE_MPI_QUERY_THREAD - Success
-- Performing Test EL_HAVE_MPI_COMM_SET_ERRHANDLER
-- Performing Test EL_HAVE_MPI_COMM_SET_ERRHANDLER - Success
-- Performing Test EL_HAVE_MPI_IN_PLACE
-- Performing Test EL_HAVE_MPI_IN_PLACE - Success
-- Performing Test EL_HAVE_MPI_COMM_F2C
-- Performing Test EL_HAVE_MPI_COMM_F2C - Success
-- Performing Test EL_MPI_COMM_NOT_INT
-- Performing Test EL_MPI_COMM_NOT_INT - Failed
-- Performing Test EL_MPI_GROUP_NOT_INT
-- Performing Test EL_MPI_GROUP_NOT_INT - Failed
-- Appending /usr/local/Cellar/mpich/3.2_2/include for MPI headers
-- Try OpenMP C flag = [-fopenmp=libomp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [ ]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [-fopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [/openmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [-Qopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [-openmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [-xopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [+Oopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [-qsmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP C flag = [-mp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [-fopenmp=libomp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [ ]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [-fopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [/openmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [-Qopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [-openmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [-xopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [+Oopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [-qsmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP CXX flag = [-mp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Failed
-- Try OpenMP Fortran flag = [-fopenmp]
-- Performing Test OpenMP_FLAG_DETECTED
-- Performing Test OpenMP_FLAG_DETECTED - Success
-- Could NOT find OpenMP (missing: OpenMP_C_FLAGS OpenMP_CXX_FLAGS)
-- Valgrind Prefix:
-- Could NOT find VALGRIND (missing: VALGRIND_INCLUDE_DIR VALGRIND_PROGRAM)
-- Will attempt to extend user-defined MATH_LIBS=/Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/libopenblas64_.dylib
-- Looking for mkl_dcsrmv
-- Looking for mkl_dcsrmv - not found
-- Looking for daxpy_64_
-- Looking for daxpy_64_ - not found
CMake Error at cmake/external_projects/ElMath.cmake:292 (message):
  daxpy_64_ was not detected
Call Stack (most recent call first):
CMakeLists.txt:358 (include)
-- Configuring incomplete, errors occurred!
See also "/Users/usefulhyun/.julia/v0.5/Elemental/deps/builds/CMakeFiles/CMakeOutput.log".
See also "/Users/usefulhyun/.julia/v0.5/Elemental/deps/builds/CMakeFiles/CMakeError.log".
The failure is not because of OpenMP but because CMake can't find a specific symbol in OpenBLAS. I just tested locally with the downloaded binary and I don't see the error. What do you get from
nm /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/libopenblas64_.dylib | grep daxpy
Here is what I get from the command above:
000000000001afb0 T _cblas_daxpy64_
000000000001a080 T _daxpy_64_
00000000010a4ea0 t _daxpy_k_ATOM
0000000000ddeee0 t _daxpy_k_BARCELONA
0000000000f462c0 t _daxpy_k_BOBCAT
0000000001980000 t _daxpy_k_BULLDOZER
0000000000536b00 t _daxpy_k_CORE2
0000000000760180 t _daxpy_k_DUNNINGTON
000000000314a650 t _daxpy_k_EXCAVATOR
0000000003300000 t _daxpy_k_HASWELL
00000000011955c0 t _daxpy_k_NANO
0000000000930000 t _daxpy_k_NEHALEM
0000000000aeb9c0 t _daxpy_k_OPTERON
0000000000c651e0 t _daxpy_k_OPTERON_SSE3
000000000064d300 t _daxpy_k_PENRYN
00000000021a0000 t _daxpy_k_PILEDRIVER
00000000003ad7e0 t _daxpy_k_PRESCOTT
00000000013b0000 t _daxpy_k_SANDYBRIDGE
0000000002a10000 t _daxpy_k_STEAMROLLER
0000000000930260 t _daxpy_kernel_8
00000000013b0260 t _daxpy_kernel_8
0000000001980260 t _daxpy_kernel_8
00000000021a0260 t _daxpy_kernel_8
0000000002a10260 t _daxpy_kernel_8
0000000003300260 t _daxpy_kernel_8
And how did you get your Julia on macOS?
I tried with the downloaded app from julialang.org. Isn't that the one you use?
Please also share the content of /Users/usefulhyun/.julia/v0.5/Elemental/deps/builds/CMakeFiles/CMake*.log
I got julia from brew's Caskroom/cask/julia
@usefulhyun it looks like your MPI compiler isn't finding libgfortran properly. Did you perhaps uninstall gcc recently? It looks like your mpich installation is from Homebrew; can you remove it (brew rm mpich) and then add it back again (brew install mpich) and see if the issue persists?
If you remove gcc, you will remove the libgfortran that your mpich installation needs, since gfortran comes as part of gcc.
@staticfloat I didn't uninstall it. I tried removing and reinstalling mpich, but it still doesn't work.
I tested with julia installed via brew install staticfloat/julia/julia and it works. However, using brew cask install julia does not work. I'm debugging now, but if you just want this to work, you can try removing the cask-installed Julia and using the version installed from my Homebrew tap.
I'm unable to look into this any more tonight. There's something going wrong with the Cask version of Julia that doesn't happen with either the pure homebrew tap, or the official binary. You should probably report this to the homebrew Cask people. :(
@staticfloat Thank you, I am trying a build with staticfloat/julia, and it has already passed the point where I got stuck before. It is a problem with the Homebrew Cask. And thank you @andreasnoack as well!
P.S. I finally got it installed on my macOS machine. I will report this to the Homebrew Cask people.
The cask binary should be exactly the same as the dmg build, just extracted to a particular location. Do they do anything more complicated to the folder structure? I've been thinking we should deprecate the staticfloat/julia tap for releases in favor of cask, since the latter won't get its dependencies changed out from underneath it.
Because the gcc on macOS doesn't support OpenMP, I failed to install this package. I tried "brew install gcc" and made an alias gcc='gcc-6', but the package installer still doesn't work.