uwsampa / grappa

Grappa: scaling irregular applications on commodity clusters
grappa.io
BSD 3-Clause "New" or "Revised" License

Build problem with MPI library #198

Open rfvander opened 9 years ago

rfvander commented 9 years ago

I downloaded Grappa and am now trying to build it, but the instructions are a bit sparse. If I define the symbols CC and CXX to resolve to the Intel compilers icc and icpc, respectively, I get the error message below. Obviously, my installed MPI cannot be found. I tried to fix that by setting “export MPICC=mpiicc”, but that did not work, nor did “export MPI_C=mpiicc”. There is no reference to MPI in “configure” or in “FindPackageHandleStandardArgs.cmake”. Do you have any suggestions? By the way, I also have GASNet installed, so if that is the better communication layer, I'll use that--if I can get some instructions on how to do that. Thanks.

Rob

[rfvander@bar1 grappa]$ export CC=icc
[rfvander@bar1 grappa]$ export CXX=icpc
[rfvander@bar1 grappa]$ ./configure --gen=Make --mode=Release
cmake /lustre/home/rfvander/grappa -G"Unix Makefiles" -DSHMMAX=33554432 -DCMAKE_C_COMPILER=icc -DCMAKE_CXX_COMPILER=icpc -DBASE_C_COMPILER=icc -DBASE_CXX_COMPILER=icpc -DCMAKE_BUILD_TYPE=RelWithDebInfo
-- The C compiler identification is Intel 15.0.0.20140723
-- The CXX compiler identification is Intel 15.0.0.20140723
-- Check for working C compiler: /opt/intel/tools/composer_xe_2015.0.090/bin/intel64/icc
-- Check for working C compiler: /opt/intel/tools/composer_xe_2015.0.090/bin/intel64/icc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /opt/intel/tools/composer_xe_2015.0.090/bin/intel64/icpc
-- Check for working CXX compiler: /opt/intel/tools/composer_xe_2015.0.090/bin/intel64/icpc -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Boost found: 1.53.0
-- /usr
CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
  Could NOT find MPI_C (missing: MPI_C_LIBRARIES)
Call Stack (most recent call first):
  /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
  /usr/share/cmake/Modules/FindMPI.cmake:587 (find_package_handle_standard_args)
  CMakeLists.txt:205 (find_package)

-- Configuring incomplete, errors occurred! See also "/lustre/home/rfvander/grappa/build/Make+Release/CMakeFiles/CMakeOutput.log".

nelsonje commented 9 years ago

Hi Rob,

We currently support only GCC or LLVM/Clang, and OpenMPI or MVAPICH2. Intel MPI will probably work as long as it supports MPI3 non-blocking collectives, but we'll need to update our configure script to identify the correct libraries to link against (it doesn't use the mpicc/mpicxx wrappers). I'm not sure about the Intel compiler---we depend on some GCC directives which it may or may not support.
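One quick way to check the MPI3 requirement, sketched here by the editor (not part of the Grappa docs), is to look for an MPI-3 non-blocking collective such as MPI_Iallreduce in the implementation's mpi.h; the default include path below is a hypothetical placeholder:

```shell
# Sketch: probe an mpi.h for MPI-3 non-blocking collectives.
# MPI_INC is a hypothetical default; point it at your MPI's include directory.
MPI_INC=${MPI_INC:-/usr/include}
if grep -q 'MPI_Iallreduce' "$MPI_INC/mpi.h" 2>/dev/null; then
  echo "MPI-3 non-blocking collectives declared in $MPI_INC/mpi.h"
else
  echo "MPI_Iallreduce not found in $MPI_INC/mpi.h"
fi
```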

We no longer support GASNet.

If you could get us an account on a machine with Intel compiler/MPI we could probably get these things debugged.

bholt commented 9 years ago

Jacob's right, we don't use the Intel compiler, so can't make any promises.

CMake ships with some pretty weak scripts to try to find libraries like MPI. It looks like you might have better luck with the HDF FindMPI.cmake (http://www.hdfgroup.org/ftp/HDF5/hdf-java/hdf-java-examples/config/cmake/FindMPI.cmake), which seems to know about Intel MPI. I think to use it, you'd have to download that FindMPI.cmake file and add the path to it to CMAKE_MODULE_PATH (it might work to put it in an environment variable), or you can pass it to CMake through our configure script:

./configure ... -- -DCMAKE_MODULE_PATH=/path/to/new/findmpi
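One detail worth calling out (editor's note): CMAKE_MODULE_PATH is a CMake cache variable, not an environment variable, so it has to travel as a -D definition after the `--` separator. A minimal sketch, where cmake-modules is a hypothetical directory holding the downloaded FindMPI.cmake and the command is echoed rather than executed:

```shell
# CMAKE_MODULE_PATH must be passed to cmake as a -D definition after `--`;
# setting it as a shell environment variable has no effect on CMake itself.
MODULE_DIR=$PWD/cmake-modules   # hypothetical location of the downloaded FindMPI.cmake
CMD="./configure --gen=Make --mode=Release -- -DCMAKE_MODULE_PATH=$MODULE_DIR"
echo "$CMD"
```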
rfvander commented 9 years ago

Thanks, Jacob. I don’t mind building with GCC initially, but will want to use the Intel compilers in the future. That’s kind of a nice thing to do when you’re an Intel employee ☺.

Rob


rfvander commented 9 years ago

Thanks, Brandon. I’ll try that HDF solution. Will let you know how I fare.

Rob


nelsonje commented 9 years ago

Great! As I said, we'd be happy to add explicit support for the Intel compiler and MPI; we just need a way to test it.

rfvander commented 9 years ago

There are multiple ways to do the testing with Intel tools, Jacob. The easiest would be to obtain a free trial license (https://software.intel.com/en-us/articles/try-buy-tools), valid for 30 days. Next best would be to sit together (virtually) to figure things out. Getting access to a machine we own is also possible, but more work. Let’s try that if the other two options do not work.


rfvander commented 9 years ago

No success with downloading HDF's FindMPI.cmake. I put it in $PWD/CMakeFiles and then defined an environment variable: "export CMAKE_MODULE_PATH=$PWD/CMakeFiles". Running the same configure command as before gave the exact same output as before. Evidently, the standard FindMPI.cmake is used, not the new one. Explicitly setting the CMAKE_MODULE_PATH on the command line did not work either. I tried two variations, with and without prefix "D":

[rfvander@bar1 grappa]$ ./configure --gen=Make --mode=Release -DCMAKE_MODULE_PATH=$CMAKE_MODULE_PATH
./configure:35:in `<main>': invalid option: -DCMAKE_MODULE_PATH=/home/rfvander/grappa/CMakeFiles (OptionParser::InvalidOption)
[rfvander@bar1 grappa]$ ./configure --gen=Make --mode=Release -CMAKE_MODULE_PATH=$CMAKE_MODULE_PATH
./configure:35:in `<main>': invalid option: -CMAKE_MODULE_PATH=/home/rfvander/grappa/CMakeFiles (OptionParser::InvalidOption)

rfvander commented 9 years ago

I also tried with double leading hyphen --DCMAKE_MODULE_PATH and --CMAKE_MODULE_PATH, to no avail.

bholt commented 9 years ago

Sometimes with CMake you also have to make sure you delete the build directory it's operating on before trying changes like this. Sorry I can't be more helpful than that right now. I'll see if I can find some time to test it out myself later.

rfvander commented 9 years ago

Thanks, Brandon. Yes, I delete build each time it gets created before I do a new test--most of the time configure chokes before it gets to create build, though :). Please let me know if it would help to debug this issue via desktop sharing. I did find, by looking inside configure, that there are no standard options dealing with CMake paths.

bholt commented 9 years ago

Okay good. Yeah I mean the idea here is that this line (https://github.com/uwsampa/grappa/blob/master/configure#L146) should be passing all the extra args through to the CMake command, so that's why I think they have to go after the --.
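The argument split can be sketched with a toy shell function (an editor's stand-in for illustration, not the configure script's actual code):

```shell
# Toy model of the pass-through: options before `--` belong to configure
# itself; everything after `--` is forwarded verbatim to cmake.
split_args() {
  local seen=0 own="" fwd=""
  for a in "$@"; do
    if [ "$seen" -eq 1 ]; then fwd="$fwd $a"
    elif [ "$a" = "--" ]; then seen=1
    else own="$own $a"; fi
  done
  echo "configure args:$own"
  echo "cmake args:$fwd"
}
split_args --gen=Make --mode=Release -- -DCMAKE_MODULE_PATH=/path/to/new/findmpi
```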

rfvander commented 9 years ago

Right, and that is why I also tried it with the preceding "--", but I have not gotten that to work yet.

rfvander commented 9 years ago

Could you take another look at the FindMPI issue I have been experiencing? The only thing I need is a way to influence the priority of the paths CMake uses for its find files. I would really like to get over this hump. You should be able to test this with any CMake find file; it doesn’t have to be MPI. I’ll be out of the office next week, but want to hit the ground running when I return. Thanks!

Rob


nelsonje commented 9 years ago

The best thing to do is for us to install the intel tools here and debug---I expect as soon as we fix the MPI discovery problem we'll run into compiler flag problems. I should have time to get them installed today or Monday, and then Brandon and I can take a look.

rfvander commented 9 years ago

Splendid, thanks, Jacob. When you get to the Intel web site, it’ll be easiest to request a trial version of the cluster software suite, which includes the compilers, MPI, and ITAC (Intel Trace Analyzer and Collector). If you need the tools for more than 30 days, let me work on this side to get you a truly free version, not just the temporary version—can’t promise I’ll succeed, though.

Rob


bholt commented 9 years ago

We certainly wouldn't turn away a license. If we're going to try to maintain compatibility then we'll need more than 30 days.

Surprisingly enough, I'll also be out of commission the next couple weeks. I'll see if I can find time to look into the CMake thing this afternoon, but it's something that's not Grappa-specific at all, so other forums may have the answer.


nelsonje commented 9 years ago

I've made some progress with this. I installed the eval version of (I think this is the right name) Intel Parallel Studio XE Cluster Edition 2015 Update 1 on our cluster, and managed to get Grappa code to run with Intel MPI, but not with the Intel compiler. More details:

Intel MPI with GCC 4.8

This just worked once I sourced the Intel MPI variable file, even without changing the MPI detection code in the configure script. I verified this by looking at the libraries listed in the Found MPI_C and Found MPI_CXX lines printed by the configure script. I did have to set a few environment variables to get MPI to work with our cluster's scheduler and network:

source  /sampa/share/intel-cluster-studio/impi_latest/bin64/mpivars.sh
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so   # to support Slurm srun
export I_MPI_FABRICS='shm:ofa'                  #  supported fabric list on our cluster

After this I was able to configure, make and run (with grappa_run) some of the demo programs. I didn't look into performance at this point.

Intel MPI with Intel compiler

I ran into both a compilation error and, once I worked around that, a segfault. The source of the segfault is not yet clear. It's showing up in a place where a segfault shouldn't be possible, but it's in the worker spawn code, and I know our stack switching code has confused other compilers in the past. I'll have to spend some more time digging into this.

I got the configure script to pick up the Intel compiler by doing this:

source /sampa/share/intel-cluster-studio/bin/iccvars.sh intel64
export CC=`which icc`
export CXX=`which icpc`
rfvander commented 9 years ago

That is great progress, Jacob. I’ll try to duplicate your effort on my cluster. Will keep you posted. How fun that you accomplished something impossible.

Rob


rfvander commented 9 years ago

Hi Jacob,

Unfortunately, what worked for you didn’t work for me—I should have known, since I always source my Intel MPI resource files in my login shell. Changing the compilers to gcc/g++ doesn’t affect the search for the MPI library, though I did try without the Intel compilers. Still searching. Happy New Year!

Rob


nelsonje commented 9 years ago

Too bad! I'll take a look at the MPI detection code and see if I can offer any hints.

If possible, could you paste in your $PATH and $LD_LIBRARY_PATH variables? Maybe we can identify an ordering problem.

Happy New Year to you too!

rfvander commented 9 years ago

Sure, Jacob:

[rfvander@bar1 bin]$ echo $PATH
/opt/intel/tools//impi/5.0.1.035/intel64/bin:/opt/intel/tools/composer_xe_2015.0.090/bin/intel64:/opt/intel/tools/composer_xe_2015.0.090/mpirt/bin/intel64:/opt/intel/tools/composer_xe_2015.0.090/debugger/gdb/intel64_mic/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:.:/home/rfvander/.local/bin:/home/rfvander/bin
[rfvander@bar1 bin]$ echo $LD_LIBRARY_PATH
/opt/intel/tools//impi/5.0.1.035/intel64/lib:/opt/intel/tools/composer_xe_2015.0.090/compiler/lib/intel64:/opt/intel/tools/composer_xe_2015.0.090/mpirt/lib/intel64:/opt/intel/tools/composer_xe_2015.0.090/ipp/../compiler/lib/intel64:/opt/intel/tools/composer_xe_2015.0.090/ipp/lib/intel64:/opt/intel/tools/composer_xe_2015.0.090/compiler/lib/intel64:/opt/intel/tools/composer_xe_2015.0.090/mkl/lib/intel64:/opt/intel/tools/composer_xe_2015.0.090/tbb/lib/intel64/gcc4.4::/home/rfvander/xstack/ocr/install/x86-pthread-x86/lib

Please promise you won’t work on this tonight ☺.

Rob


nelsonje commented 9 years ago

Don't worry; going home shortly. :-)

Those variables look fine. Another thing to try: run the configure script with tracing enabled. There will be a bunch of output from the FindMPI.cmake module, and we may identify something there. Send me a copy of the output over email (just in case anything sensitive ends up in the trace). Here's the command I used to do this:

./configure -- --trace 2>&1 | tee cmake-trace.txt
rfvander commented 9 years ago

Thanks, Jacob. I sent the output via email.

Rob


nelsonje commented 9 years ago

Rob, try configuring Grappa with a command like this (with the paths fixed for your Intel MPI installation, of course):

./configure -- -DMPI_C_COMPILER=/sampa/share/intel-cluster-studio/impi_latest/bin64/mpigcc -DMPI_CXX_COMPILER=/sampa/share/intel-cluster-studio/impi_latest/bin64/mpigxx
nelsonje commented 9 years ago

If that doesn't work, according to the FindMPI.cmake source, we should be able to set the include/library paths and flags directly like this:

./configure -- -DMPI_CXX_INCLUDE_PATH=(something)  -DMPI_CXX_LINK_FLAGS=(something)  -DMPI_CXX_COMPILE_FLAGS=(something)  -DMPI_CXX_LIBRARIES=(something) 

Relevant docs from FindMPI.cmake:

# === Variables ===
#
# This module will set the following variables per language in your project,
# where <lang> is one of C, CXX, or Fortran:
#   MPI_<lang>_FOUND           TRUE if FindMPI found MPI flags for <lang>
#   MPI_<lang>_COMPILER        MPI Compiler wrapper for <lang>
#   MPI_<lang>_COMPILE_FLAGS   Compilation flags for MPI programs
#   MPI_<lang>_INCLUDE_PATH    Include path(s) for MPI header
#   MPI_<lang>_LINK_FLAGS      Linking flags for MPI programs
#   MPI_<lang>_LIBRARIES       All libraries to link MPI programs against

and

# === Usage ===
#
# To use this module, simply call FindMPI from a CMakeLists.txt file, or
# run find_package(MPI), then run CMake.  If you are happy with the auto-
# detected configuration for your language, then you're done.  If not, you
# have two options:
#   1. Set MPI_<lang>_COMPILER to the MPI wrapper (mpicc, etc.) of your
#      choice and reconfigure.  FindMPI will attempt to determine all the
#      necessary variables using THAT compiler's compile and link flags.
#   2. If this fails, or if your MPI implementation does not come with
#      a compiler wrapper, then set both MPI_<lang>_LIBRARIES and
#      MPI_<lang>_INCLUDE_PATH.  You may also set any other variables
#      listed above, but these two are required.  This will circumvent
#      autodetection entirely.
# When configuration is successful, MPI_<lang>_COMPILER will be set to the
# compiler wrapper for <lang>, if it was found.  MPI_<lang>_FOUND and other
# variables above will be set if any MPI implementation was found for <lang>,
# regardless of whether a compiler was found.
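Concretely, option 2 for an Intel MPI installation might look like the following sketch (editor's addition; every path here is hypothetical and must be adjusted for the local installation, and the command is echoed rather than executed):

```shell
# Hypothetical Intel MPI prefix -- adjust for the local installation.
# Echoed as a sketch; drop the `echo` to actually run it.
IMPI=/opt/intel/tools/impi/5.0.1.035/intel64
echo ./configure -- \
  "-DMPI_C_INCLUDE_PATH=$IMPI/include" \
  "-DMPI_C_LIBRARIES=$IMPI/lib/libmpi.so" \
  "-DMPI_CXX_INCLUDE_PATH=$IMPI/include" \
  "-DMPI_CXX_LIBRARIES=$IMPI/lib/libmpicxx.so;$IMPI/lib/libmpi.so"
```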
nelsonje commented 9 years ago

For further reference, when I configure using GCC 4.8 and Intel MPI, these are what the CMake MPI_* variables discussed above are set to:

MPI_CXX_COMPILER='/sampa/share/intel-cluster-studio/impi_latest/bin64/mpigxx'
MPI_CXX_COMPILE_FLAGS=''
MPI_CXX_INCLUDE_PATH='/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/include'
MPI_CXX_LINK_FLAGS=''
MPI_CXX_LIBRARIES='/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/lib/libmpicxx.so;/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/lib/libmpifort.so;/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/lib/release_mt/libmpi.so;/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/lib/libmpigi.a;/usr/lib64/libdl.so;/usr/lib64/librt.so;/usr/lib64/libpthread.so'

and for C (not sure if these are important for our build):

MPI_C_COMPILER='/sampa/share/intel-cluster-studio/impi_latest/bin64/mpigcc'
MPI_C_COMPILE_FLAGS=''
MPI_C_INCLUDE_PATH='/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/include'
MPI_C_LINK_FLAGS=''
MPI_C_LIBRARIES='/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/lib/libmpifort.so;/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/lib/release_mt/libmpi.so;/sampa/share/intel-cluster-studio/impi/5.0.2.044/intel64/lib/libmpigi.a;/usr/lib64/libdl.so;/usr/lib64/librt.so;/usr/lib64/libpthread.so'

The FindMPI code discovers these by running mpigxx -show; if you can't get CMake to identify these paths automatically, you can run a command like that and translate the output into CMake's format.
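That translation can be sketched as follows (editor's addition; the SHOW string is a made-up example of wrapper output, not taken from a real installation):

```shell
# Convert the flags printed by an MPI wrapper's -show option into the
# semicolon-separated lists CMake expects. SHOW is a hypothetical example
# of such output; in practice, capture it with e.g. SHOW=$(mpigxx -show).
SHOW='gcc -I/opt/impi/intel64/include -L/opt/impi/intel64/lib -lmpicxx -lmpi -lrt -lpthread'
INC=$(echo "$SHOW" | tr ' ' '\n' | sed -n 's/^-I//p' | paste -sd';' -)
LIBS=$(echo "$SHOW" | tr ' ' '\n' | grep '^-l' | paste -sd';' -)
echo "-DMPI_C_INCLUDE_PATH=$INC"
echo "-DMPI_C_LIBRARIES=$LIBS"   # note: FindMPI prefers full .so paths; -l flags shown for brevity
```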

nelsonje commented 9 years ago

I should also point out that in my bash environment I have the environment variables CC=gcc and CXX=g++ set. You can also set these with a single option to our configure script, like this:

./configure --cc=/sampa/share/gcc-4.8.2/rtf/bin/gcc -- -DMPI_C_COMPILER=/sampa/share/intel-cluster-studio/impi_latest/bin64/mpigcc -DMPI_CXX_COMPILER=/sampa/share/intel-cluster-studio/impi_latest/bin64/mpigxx
rfvander commented 9 years ago

New issue: configure chokes on --cc:

[rfvander@bar1 grappa]$ ./configure -- --cc=/usr/bin/gcc -DMPI_C_COMPILER=mpigcc -DMPI_CXX_COMPILER=mpigxx
Must specify C compiler (either use '--cc=' flag, or set the environment variables CC & CXX hint: if the compiler you want is on your PATH, you can do: --cc=$(which gcc)...

rfvander commented 9 years ago

Hi Jacob,

Sadly, none of this worked. Configure keeps refusing to recognize MPI_C(XX)_LIBRARIES, even though I define them (I have already exported CC and CXX, and CMake does find those):

[rfvander@bar1 grappa]$ export MPI_CXX_LIBRARIES=/opt/intel/tools/impi/5.0.1.035/lib64
[rfvander@bar1 grappa]$ export MPI_C_LIBRARIES=/opt/intel/tools/impi/5.0.1.035/lib64
[rfvander@bar1 grappa]$ \rm -rf build
[rfvander@bar1 grappa]$ ./configure
cmake /lustre/home/rfvander/grappa -G"Unix Makefiles" -DSHMMAX=33554432 -DCMAKE_C_COMPILER=gcc -DCMAKE_CXX_COMPILER=g++ -DBASE_C_COMPILER=gcc -DBASE_CXX_COMPILER=g++ -DCMAKE_BUILD_TYPE=RelWithDebInfo
-- The C compiler identification is GNU 4.8.3
-- The CXX compiler identification is GNU 4.8.3
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++
-- Check for working CXX compiler: /usr/bin/g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Boost found: 1.53.0
-- /usr
CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
  Could NOT find MPI_C (missing: MPI_C_LIBRARIES)
Call Stack (most recent call first):
  /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
  /usr/share/cmake/Modules/FindMPI.cmake:587 (find_package_handle_standard_args)
  CMakeLists.txt:205 (find_package)

-- Configuring incomplete, errors occurred! See also "/lustre/home/rfvander/grappa/build/Make+Release/CMakeFiles/CMakeOutput.log".


rfvander commented 9 years ago

And this is what happens if I put things on the command line (mpigcc and mpigxx are in my path):

[rfvander@bar1 grappa]$ ./configure -- -DMPI_C_COMPILER=mpigcc -DMPI_CXX_COMPILER=mpigxx
cmake /lustre/home/rfvander/grappa -G"Unix Makefiles" -DSHMMAX=33554432 -DCMAKE_C_COMPILER=gcc -DCMAKE_CXX_COMPILER=g++ -DBASE_C_COMPILER=gcc -DBASE_CXX_COMPILER=g++ -DCMAKE_BUILD_TYPE=RelWithDebInfo -DMPI_C_COMPILER=mpigcc -DMPI_CXX_COMPILER=mpigxx
-- The C compiler identification is GNU 4.8.3
-- The CXX compiler identification is GNU 4.8.3
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++
-- Check for working CXX compiler: /usr/bin/g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Boost found: 1.53.0
-- /usr
CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
  Could NOT find MPI_C (missing: MPI_C_LIBRARIES)
Call Stack (most recent call first):
  /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
  /usr/share/cmake/Modules/FindMPI.cmake:587 (find_package_handle_standard_args)
  CMakeLists.txt:205 (find_package)

-- Configuring incomplete, errors occurred! See also "/lustre/home/rfvander/grappa/build/Make+Release/CMakeFiles/CMakeOutput.log".


rfvander commented 9 years ago

Hi Jacob,

Can we consider chucking CMake? If I have such tremendous trouble building on a pretty vanilla system, (almost) everybody else will struggle, too, and the bandwidth of the Grappa team will be strained. I’ll be happy for now with some sample makefiles you’ve generated and finally getting some results using Grappa. Thanks.

Rob

From: Van Der Wijngaart, Rob F Sent: Thursday, January 08, 2015 12:37 PM To: 'uwsampa/grappa' Subject: RE: [grappa] Build problem with MPI library (#198)

Hi Jacob,

Sadly, none of this worked. Configure keeps refusing to recognize MPI_C(XX)_LIBRARIES, even though I define them (I have already exported CC and CXX, and CMake does find those):

[rfvander@bar1 grappa]$ export MPI_CXX_LIBRARIES=/opt/intel/tools/impi/5.0.1.035/lib64
[rfvander@bar1 grappa]$ export MPI_C_LIBRARIES=/opt/intel/tools/impi/5.0.1.035/lib64
[rfvander@bar1 grappa]$ \rm -rf build
[rfvander@bar1 grappa]$ ./configure
cmake /lustre/home/rfvander/grappa -G"Unix Makefiles" -DSHMMAX=33554432 -DCMAKE_C_COMPILER=gcc -DCMAKE_CXX_COMPILER=g++ -DBASE_C_COMPILER=gcc -DBASE_CXX_COMPILER=g++ -DCMAKE_BUILD_TYPE=RelWithDebInfo
-- The C compiler identification is GNU 4.8.3
-- The CXX compiler identification is GNU 4.8.3
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++
-- Check for working CXX compiler: /usr/bin/g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Boost found: 1.53.0 -- /usr
CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
  Could NOT find MPI_C (missing: MPI_C_LIBRARIES)
Call Stack (most recent call first):
  /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
  /usr/share/cmake/Modules/FindMPI.cmake:587 (find_package_handle_standard_args)
  CMakeLists.txt:205 (find_package)

-- Configuring incomplete, errors occurred! See also "/lustre/home/rfvander/grappa/build/Make+Release/CMakeFiles/CMakeOutput.log".

From: Jacob Nelson [mailto:notifications@github.com] Sent: Thursday, January 08, 2015 12:56 AM To: uwsampa/grappa Cc: Van Der Wijngaart, Rob F Subject: Re: [grappa] Build problem with MPI library (#198)

If that doesn't work, according to the FindMPI.cmake source, we should be able to set the include/library paths and flags directly like this:

./configure -- -DMPI_CXX_INCLUDE_PATH=(something) -DMPI_CXX_LINK_FLAGS=(something) -DMPI_CXX_COMPILE_FLAGS=(something) -DMPI_CXX_LIBRARIES=(something)

Relevant docs from FindMPI.cmake:

=== Variables ===

This module will set the following variables per language in your project,
where <lang> is one of C, CXX, or Fortran:

  MPI_<lang>_FOUND           TRUE if FindMPI found MPI flags for <lang>
  MPI_<lang>_COMPILER        MPI Compiler wrapper for <lang>
  MPI_<lang>_COMPILE_FLAGS   Compilation flags for MPI programs
  MPI_<lang>_INCLUDE_PATH    Include path(s) for MPI header
  MPI_<lang>_LINK_FLAGS      Linking flags for MPI programs
  MPI_<lang>_LIBRARIES       All libraries to link MPI programs against

and

=== Usage ===

To use this module, simply call FindMPI from a CMakeLists.txt file, or
run find_package(MPI), then run CMake. If you are happy with the
auto-detected configuration for your language, then you're done. If not,
you have two options:

  1. Set MPI_<lang>_COMPILER to the MPI wrapper (mpicc, etc.) of your
     choice and reconfigure. FindMPI will attempt to determine all the
     necessary variables using THAT compiler's compile and link flags.

  2. If this fails, or if your MPI implementation does not come with
     a compiler wrapper, then set both MPI_<lang>_LIBRARIES and
     MPI_<lang>_INCLUDE_PATH. You may also set any other variables
     listed above, but these two are required. This will circumvent
     autodetection entirely.

When configuration is successful, MPI_<lang>_COMPILER will be set to the
compiler wrapper for <lang>, if it was found. MPI_<lang>_FOUND and other
variables above will be set if any MPI implementation was found for <lang>,
regardless of whether a compiler was found.
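A note on why the exports above don't help: environment variables like MPI_C_LIBRARIES are not read by CMake for this purpose; these are cache variables and have to be passed on the command line with -D. CMake also expects MPI_<lang>_LIBRARIES to name actual library files, not a directory. A sketch of a direct override follows; the include and library paths are illustrative and must be adjusted to the local MPI installation:

```shell
# Illustrative paths only -- point these at your real MPI headers/libraries.
./configure -- \
  -DMPI_C_INCLUDE_PATH=/opt/intel/tools/impi/5.0.1.035/include64 \
  -DMPI_C_LIBRARIES=/opt/intel/tools/impi/5.0.1.035/lib64/libmpi.so \
  -DMPI_CXX_INCLUDE_PATH=/opt/intel/tools/impi/5.0.1.035/include64 \
  -DMPI_CXX_LIBRARIES=/opt/intel/tools/impi/5.0.1.035/lib64/libmpicxx.so
```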

— Reply to this email directly or view it on GitHubhttps://github.com/uwsampa/grappa/issues/198#issuecomment-69151893.

rfvander commented 9 years ago

1) I hate CMake, too, but it’s not going away. Lots of people are moving to it.

2) This is not really a CMake problem. You need to use CC=mpicc and CXX=mpicxx.

3) I tend to use toolchain files for almost every machine I use. See https://github.com/elemental/Elemental/tree/master/cmake/toolchains for some examples.

Best,

Jeff
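The toolchain-file approach in Jeff's third point can be sketched like this; the file name and the compiler/wrapper choices are hypothetical, picked to match the Intel MPI wrappers mentioned earlier in the thread:

```shell
# Write a minimal per-machine CMake toolchain file (hypothetical content).
tmpdir=$(mktemp -d)
cat > "$tmpdir/bar1-toolchain.cmake" <<'EOF'
set(CMAKE_C_COMPILER   gcc)
set(CMAKE_CXX_COMPILER g++)
set(MPI_C_COMPILER     mpigcc)   # Intel MPI's gcc wrapper
set(MPI_CXX_COMPILER   mpigxx)   # Intel MPI's g++ wrapper
EOF
# Reuse it on every configure instead of retyping -D flags, e.g.:
#   cmake -DCMAKE_TOOLCHAIN_FILE="$tmpdir/bar1-toolchain.cmake" <source-dir>
cat "$tmpdir/bar1-toolchain.cmake"
```

One file per machine keeps the machine-specific compiler and MPI settings out of the build invocation itself.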


rfvander commented 9 years ago

Brilliant, this worked, Jeff! Very confusing, you shouldn't have to do it, but it works.

Rob


nelsonje commented 9 years ago

First, we've run on multiple machines at PNNL, Sandia, TACC, SDSC, and multiple UW campuses, and we've never seen anything like this before. There's something simple going on here; we just have to figure out what it is. Debugging this is going to take less time than rewriting with GNU autotools.

Second, CC=gcc and CXX=g++ is correct for our build system; the toolchain is supposed to extract libraries/flags from mpicc and then link using the standard compiler. CC=mpicc can cause other linking problems down the line. I'd still like to figure out what's going on with your configuration to avoid this in the future.
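The flag-extraction step described above can be approximated by hand: ask the wrapper what it would do. This sketch assumes an MPICH-style wrapper that understands -show (Open MPI wrappers use --showme instead), and prints a placeholder when no wrapper is installed:

```shell
# Print the flags the MPI wrapper would pass to the underlying compiler;
# this is roughly the information FindMPI extracts so that CMake can then
# drive the plain gcc/g++ compilers itself.
if command -v mpicc >/dev/null 2>&1; then
  mpicc -show 2>/dev/null || mpicc --showme 2>/dev/null || echo "mpicc found, but it reports no flags"
else
  echo "no mpicc wrapper on PATH"
fi
```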

Can we (re)schedule a time for me to poke around?

rfvander commented 9 years ago

Hi Jacob, certainly, I’ll schedule a meeting. But, with all due respect, it is not important that you get things to work on other systems, but that other users get things to work, for Grappa to become a success. And I’d certainly like it to become that. Meanwhile, after I followed Jeff’s suggestion and managed to configure Grappa, I did indeed run into other problems. So far I have not been able to compile any of the examples included in the repository.

Rob


— Reply to this email directly or view it on GitHubhttps://github.com/uwsampa/grappa/issues/198#issuecomment-69253621.

nelsonje commented 9 years ago

I completely agree---I'm saying we didn't realize we had a problem with our build system, and I want to track it down and kill it so we don't run into these problems in the future. I appreciate your help with this, and I'm sorry it's taking so long.

rfvander commented 9 years ago

FYI, I have no problems building on Mac OSX with MPICH like this:

jrhammon-mac01:github jrhammon$ CC=/opt/mpich/dev/intel/fast/bin/mpicc CXX=/opt/mpich/dev/intel/fast/bin/mpicxx ./configure

Note that I have to use the intel/fast instance of MPICH on my machine because of stupid dynamic linkage issues that are independent of Grappa.

Jeff


rfvander commented 9 years ago

And I actually just did the same myself and managed to build an executable; if only I could now find out where it lives and how to run it :).

Rob


rfvander commented 9 years ago

try this:

find . -type f -perm 755

:-)
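A caveat on the find(1) one-liner: -perm 755 matches mode 0755 exactly, so a binary installed as, say, 0750 slips through. Matching on the owner-execute bit is more forgiving. A self-contained demonstration (the directory and file names are made up):

```shell
# Demo: locate executables by owner-execute bit rather than exact mode.
demo=$(mktemp -d)
mkdir -p "$demo/bin"
touch "$demo/bin/graph_app" "$demo/notes.txt"
chmod 750 "$demo/bin/graph_app"   # -perm 755 would miss this one
chmod 644 "$demo/notes.txt"
find "$demo" -type f -perm -u+x   # prints only .../bin/graph_app
```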


rfvander commented 9 years ago

Found it. Now I need to find out how to run it.

Rob


nelsonje commented 9 years ago

Yes, OS X and MPICH have worked for us too. Currently there are a couple problems running, though, which is why we're not claiming to support OS X yet. In the next month or so I hope to make some changes to our communication layer to fix that problem.

BTW, Jeff, something about replying to this via email makes GitHub think it's coming from Rob.


— Reply to this email directly or view it on GitHub https://github.com/uwsampa/grappa/issues/198#issuecomment-69403091.

rfvander commented 9 years ago

My experiment was on Linux, the system we were working on together yesterday, Jacob. So that’s sweet success. Should I look for grapparun?

Rob

From: Jacob Nelson [mailto:notifications@github.com] Sent: Friday, January 09, 2015 2:02 PM To: uwsampa/grappa Cc: Van Der Wijngaart, Rob F Subject: Re: [grappa] Build problem with MPI library (#198)

Yes, OS X and MPICH have worked for us too. Currently there are a couple problems running, though, which is why we're not claiming to support OS X yet. In the next month or so I hope to make some changes to our communication layer to fix that problem.

BTW, Jeff, something about replying to this via email makes GitHub think it's coming from Rob.

On Fri, Jan 9, 2015 at 1:36 PM, rfvander <notifications@github.com> wrote:

FYI, I have no problems building on Mac OSX with MPICH like this:

jrhammon-mac01:github jrhammon$ CC=/opt/mpich/dev/intel/fast/bin/mpicc CXX=/opt/mpich/dev/intel/fast/bin/mpicxx ./configure

Note that I have to use the intel/fast instance of MPICH on my machine because of stupid dynamic linkage issues that are independent of Grappa.

Jeff


nelsonje commented 9 years ago

Fantastic!

If you did "make demo-hello_world" in build/Make+Release and your cluster uses Slurm, you should be able to do something like

bin/grappa_run --nnode 4 --ppn 4 applications/demos/hello_world.exe

in build/Make+Release. (yes, we add .exe to binaries on Linux...)

rfvander commented 9 years ago

Sadly, this is what happened:

[rfvander@bar1 Make+Release]$ bin/grappa_run --nnode 4 --ppn 4 applications/demos/hello_world.exe
Unsupported MPI version: Intel(R) MPI Library for Linux* OS, 64-bit applications, Version 5.0 Update 1 Build 20140709
Copyright (C) 2003-2014 Intel Corporation. All rights reserved.

How do I find out whether my cluster uses Slurm?

Rob
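(For anyone landing on this thread with the same question: if the Slurm client tools are on your PATH, the cluster is almost certainly managed by Slurm. A minimal sketch of that check, which assumes nothing beyond a POSIX shell:)

```shell
# Minimal check: presence of Slurm client tools on the PATH strongly
# suggests the cluster is managed by Slurm.
if command -v sinfo >/dev/null 2>&1 || command -v srun >/dev/null 2>&1; then
  slurm_status="Slurm client tools found"
else
  slurm_status="no Slurm client tools on PATH"
fi
echo "$slurm_status"
```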


nelsonje commented 9 years ago

Ah, that's an overly-restrictive test in that script.

How do you run jobs? If it involves srun or salloc, we should be close to working. If not we'll have to figure out the right flags to pass environment variables to child processes.
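(For reference, Intel MPI's hydra launcher can forward environment variables on its own, via the global options -genv and -genvall. A sketch of what such an invocation might look like; GLOG_logtostderr is only an illustrative variable name, not necessarily one Grappa requires:)

```shell
# Sketch: forwarding environment variables with Intel MPI's hydra launcher.
# -genv sets one variable for all ranks; -genvall forwards the caller's whole
# environment. GLOG_logtostderr is an illustrative variable name here.
cmd="mpiexec.hydra -n 16 -ppn 4 -genv GLOG_logtostderr 1 -genvall ./hello_world.exe"
echo "$cmd"   # printed rather than executed, since no cluster is assumed here
```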

rfvander commented 9 years ago

I normally run with mpirun (single node), or mpiexec.hydra (multiple nodes).

Rob


nelsonje commented 9 years ago

I see. Does that interact with a job scheduler (Slurm, Torque, PBS, etc.) or do you avoid collisions between jobs in some other way?

In any case, I'll look at the Intel MPI docs and figure out the right command to pass down all the environment variables we need to run with mpiexec.hydra alone.


rfvander commented 9 years ago

Hi Jacob,

This is the situation on the clusters I use:

  1.  Non-production systems: No job scheduler at all; collision avoidance is the responsibility of the users of the cluster.
  2.  Production systems: LSF to get an allocation on the machine, and once you have it, you run mpiexec.hydra unimpeded.

So in either case, there is no interaction with a job scheduler.

Rob


nelsonje commented 9 years ago

I updated the grappa_run script to accept Intel MPI. If you do a pull from master, you should be able to run a command like

bin/grappa_run --mpi=true --nnode=4 --ppn=4 -- applications/demos/hello_world.exe

and it will call mpiexec.hydra with the right flags.

rfvander commented 9 years ago

OK, thanks. And how do I indicate the host file that lists the nodes on which to run?

Rob
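(The thread does not answer this, but with plain mpiexec.hydra a hostfile is normally passed via -f, one hostname per line. A sketch under that assumption; node01/node02 are placeholder hostnames, and whether grappa_run forwards such an option is not shown here:)

```shell
# Sketch: with plain mpiexec.hydra, a hostfile (one hostname per line) is
# normally passed via -f. Whether grappa_run forwards such an option is not
# shown in this thread, so this only illustrates the underlying launcher.
cat > hosts.txt <<'EOF'
node01
node02
EOF
echo "mpiexec.hydra -f hosts.txt -ppn 4 -n 8 ./hello_world.exe"
```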
