zbeekman closed this issue 7 years ago.
I'd like for OpenCoarrays to be available by any and every mechanism possible. Are you offering to assist with the set-up or asking about our plans. Help is always welcome. We haven't scoped it out, but Alessandro and I are talking tomorrow and will add this to the agenda. Please create an entry for this in our Issues page.
D
Sent from my iPhone
On Nov 23, 2015, at 9:42 AM, Izaak Beekman notifications@github.com wrote:
I wanted to gauge interest/support for creating and submitting a Homebrew Formula to build & install opencoarrays.
Pros:
- Better package manager for OS X (IMO)
- Reach a larger user base, including those who shun MacPorts
- Pre-requisites (CMake, MPICH, GCC/GFortran 5.2) managed and installed via Homebrew
- Supports CMake build & install systems natively

Cons:
- No download tracking feature, that I am aware of....
I'm offering to take a crack at doing it myself. I am the maintainer of a number of Fortran related Homebrew Formulae: JSON-Fortran, FoBiS.py, and FORD.
It looks like there may be a way to track some of the usage using Homebrew: https://github.com/Homebrew/homebrew/issues/46289#issuecomment-159009961
(A Homebrew "bottle" is a package of binaries which is built when the formula is tested. Unless the user requests an install with non-standard options or from the "HEAD" source (should one be provided in the formula), the binary assets built during the formula testing are downloaded and installed, rather than building the assets from source again. A bottle is created for each of the three most recent OS X releases, and Homebrew fetches and installs the appropriate one.)
On Nov 23, 2015, at 9:50 AM, Izaak Beekman notifications@github.com wrote:
I'm offering to take a crack at doing it myself. I am the maintainer of a number of Fortran related Homebrew Formulae: JSON-Fortran, FoBiS.py, and FORD.
Awesome! Thanks tons. We look forward to hearing how it goes.
D
If it's accepted, then the statistics tracking page will look something like this: https://bintray.com/homebrew/bottles/a2ps/view#statistics
If I'm successful and it is accepted, I would encourage you to drive people towards Homebrew over MacPorts for better download tracking and ease of use... but that's up to you guys.
On Nov 23, 2015, at 9:57 AM, Izaak Beekman notifications@github.com wrote:
It looks like there may be a way to track some of the usage using Homebrew: Homebrew/homebrew#46289 (comment) https://github.com/Homebrew/homebrew/issues/46289#issuecomment-159009961
Great news. If this works out, we’ll add you to the list of contributors on OpenCoarrays.org and feature your Homebrew option in the installation instructions and recommend it as preferable to MacPorts.
D
I saw the 1.0.0 branch which looks like it includes a bunch of install/build/dependency stuff. Are there going to be major changes in the build/install process from the current master branch? Should I hold off on the Homebrew Formula until 1.x? I started to work on it, but haven't had more than a few minutes. I downloaded a bunch of documentation and started the process of writing the formula, so I can work on it in the car tomorrow. Do you think I should wait for 1.x.x or go ahead and write the formula for OCA now?
Good question. For someone who is comfortable with the (ahem) very minimal CMake commands described in INSTALL.md, nothing will change.
The reason for the install.sh script is that even very smart users are intimidated by CMake (with at least some good reason) and so my goal is to make installation absolutely as simple as possible. I want a default installation to require nothing more than
./install.sh
after which all primary, secondary, tertiary, etc. prerequisites will be detected or downloaded, built, and installed and the user just sits back and watches the magic happen. Phew... It's a tall order, but I believe I've reached a point where the dream is within reach and the primary requirement at this point is just more testing and more bug reports on as many systems as possible.
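The detect-or-build goal described above can be sketched in a few lines of shell. The function name and messages here are hypothetical illustrations of the idea, not install.sh's actual logic:

```shell
# Hypothetical sketch of the detect-or-build idea: for each prerequisite,
# use it if it is already on the PATH, otherwise fall back to building it.
check_dep() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: missing (install.sh would download, build, and install it here)"
  fi
}

check_dep cmake
check_dep mpifort
```

A real installer additionally has to compare version numbers and check the default OpenCoarrays installation location, which is where the combinatorial explosion mentioned later in this thread comes from.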
I hope to push release 1.2.1 with significant improvements to the installation process within 24 hours.
D
Sent from my iPhone
On Nov 28, 2015, at 9:34 PM, Izaak Beekman notifications@github.com wrote:
I saw the 1.0.0 branch which looks like it includes a bunch of install/build/dependency stuff. Are there going to be major changes in the build/install process from the current master branch? Should I hold off on the Homebrew Formula until 1.x? I started to work on it, but haven't had more than a few minutes. I downloaded a bunch of documentation and started the process of writing the formula, so I can work on it in the car tomorrow. Do you think I should wait for 1.x.x or go ahead and write the formula for OCA now?
Ok, great. I'll probably wait for the next release. The Homebrew formula includes a SHA256 of the release tar balls, so every time a new release happens the formula must be updated with the version, tarball URL, and SHA256.
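The per-release bookkeeping described above can be sketched as follows. The version number and the empty stand-in tarball are purely illustrative; a real update would download the actual GitHub release tarball and hash that:

```shell
# Sketch: each OpenCoarrays release needs a fresh url/sha256 pair in the formula.
version="1.2.1"                            # hypothetical release number
tarball="opencoarrays-${version}.tar.gz"
tar czf "$tarball" -T /dev/null            # empty stand-in for the real release tarball

# Compute the checksum (sha256sum on Linux, shasum on OS X).
sha256=$( (sha256sum "$tarball" 2>/dev/null || shasum -a 256 "$tarball") | awk '{print $1}')

# These are the two formula fields that change with every release.
echo "url \"https://github.com/sourceryinstitute/opencoarrays/archive/${version}.tar.gz\""
echo "sha256 \"${sha256}\""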
Also, FYI, did you know that you can embed compressed tarballs in install scripts? I first learned about this because CPack provides this functionality. It means that users don't even need to know how to untar the distributed installation files, only run ./install.sh. I can send a link later this week and/or submit a PR with a script to build the script for each release.
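The embedded-tarball trick mentioned above can be demonstrated in portable shell. Everything here (the file names, the __PAYLOAD__ marker, the payload contents) is an illustrative sketch, not how CPack or OpenCoarrays actually packages releases:

```shell
# Build a tiny payload to embed (stand-in for a real release tree).
mkdir -p demo/pkg && echo "hello" > demo/pkg/README
tar czf demo/payload.tar.gz -C demo pkg

# The stub script: at run time it locates the __PAYLOAD__ marker inside
# itself and pipes everything after that line into tar.
cat > demo/stub.sh <<'EOF'
#!/bin/sh
start=$(awk '/^__PAYLOAD__$/ {print NR + 1; exit}' "$0")
tail -n +"$start" "$0" | tar xz
exit 0
__PAYLOAD__
EOF

# Glue stub + payload into a single self-extracting installer.
cat demo/stub.sh demo/payload.tar.gz > demo/install.sh
chmod +x demo/install.sh

# Running the installer unpacks pkg/ into the current directory.
(mkdir -p demo/out && cd demo/out && ../install.sh && cat pkg/README)
```

The `exit 0` before the marker matters: the shell stops executing there and never tries to interpret the binary payload that follows.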
One last question, do you know if OCA will work with clang as the mpicc compiler? It seems that the Homebrew MPICH prefers to use clang over GCC.
Hi Izaak,
My script now functions largely as I'd hoped. The primary challenge is the combinatorial explosion of possibilities for a given dependency to be missing, present in the user's PATH with a sufficiently recent version number, present in the user's PATH with an insufficient version number, or present in the default OpenCoarrays installation location for the particular dependency. Currently, when CMake is missing and the installation script has to install it, I run into an error that apparently you've encountered before:
https://cmake.org/pipermail/cmake/2013-October/056181.html
Any advice? It would be great to set up some time to talk about this during the week if you're available.
Damian Rouson, Ph.D., P.E. President, Sourcery Institute http://www.sourceryinstitute.org +1-510-600-2992 (mobile)
On Nov 29, 2015, at 6:13 AM, Izaak Beekman notifications@github.com wrote:
Ok, great. I'll probably wait for the next release. The Homebrew formula includes a SHA256 of the release tar balls, so every time a new release happens the formula must be updated with the version, tarball URL, and SHA256.
Also, FYI, did you know that you can embed compressed tarballs in install scripts? I first learned about this because CPack provides this functionality. It means that users don't even need to know how to untar the distributed installation files, only run ./install.sh. I can send a link later this week and/or submit a PR with a script to build the script for each release.
One last question, do you know if OCA will work with clang as the mpicc compiler? It seems that the Homebrew MPICH prefers to use clang over GCC.
Hi Damian, Let's break this discussion regarding CMake installation issues into a new issue or email, as it doesn't pertain to the Homebrew Formula.
Hi Damian,
As part of the formula, a description is required. The length of "name: <desc>" needs to be less than 80 characters. "Opencoarrays" is 12 chars + 2 for the colon and space, leaving us with 66 characters to work with. Right now this is what I have as the description, but thought I would run it by you:
Coarray Fortran RTL and toolchain implemented on top of MPI.
As it stands this is 60 chars.
Another thought:
Free & open Coarray Fortran runtime library and toolchain.
or better yet:
Coarray Fortran runtime library and toolchain, free & open source
@rouson This is what I think is best for the description, but I'd love your input (please see above for alternatives):
Coarray Fortran runtime library and toolchain, free & open source
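The length arithmetic above is easy to double-check. A quick sanity check (Homebrew itself enforces this during its audit; the computation here is just the 12 + 2 + desc budget described earlier):

```shell
# Verify the "name: desc" line stays under Homebrew's 80-character limit.
name="opencoarrays"
desc="Coarray Fortran runtime library and toolchain, free & open source"
total=$(( ${#name} + 2 + ${#desc} ))   # +2 for the colon and space
echo "total: $total (limit: 80)"
```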
The Formula is nearly done, just waiting on some feedback from @afanfa on #28 and some feedback from @homebrew regarding best practices and favoring MPICH over OpenMPI: https://github.com/Homebrew/homebrew/issues/46551
Hi Izaak,
Let's go with "open-source coarray Fortran ABI, API, and compiler wrapper" if that sounds ok to you. My explanation is below.
I frequently use the term "runtime library", but I think Alessandro has identified some ways in which it might not quite fit what OpenCoarrays is. For that reason, we initially went with the description "transport layer" on www.opencoarrays.org. Transport layer always seemed pretty vague to me, but I've lately come to the impression that it's a lot more common and more meaningful to computer scientists -- especially folks steeped in network protocols -- than it is to applications folks like you and me.
Lately, I'm evolving toward application binary interface (ABI) and application programmer interface (API). Both are a bit more widely used (especially API) and more descriptive to my way of thinking. Our ABI is the set of functions the compiler calls and it's actually how the gfortran documentation refers to OpenCoarrays (see https://gcc.gnu.org/onlinedocs/gfortran/Coarray-Programming.html). Effectively, it's libcaf_x.a in its many forms, where "x" can be MPI or a CUDA-accelerated MPI or GASNet or something else. Our API is opencoarrays.mod, which is produced by compiling extensions/opencoarrays.f90. It contains Fortran wrappers for the ABI and is especially useful for extending the capabilities of compilers that don't yet support CAF or don't support some of the CAF features we support. In addition to the ABI and API, we also provide a compiler wrapper (caf) that does some minimal code transformations for users of non-CAF compilers and we provide a program launcher that is a very minimalistic wrapper for mpirun.
Damian
Great! I'll update in the AM. The formula PR is getting close.
OK, I seem to be close to having the formula refined, but there are some lingering issues with Homebrew's MPI implementations that were recently introduced (bugs, I think). Here is a link to my PR as it evolves: homebrew/homebrew#46547
Here are the discussions for the issues holding it up: homebrew/homebrew#46422 and homebrew/homebrew#46461
I'm going to try to get GASNet into Homebrew as well, in the event that we want to provide an option to build and link against that. However, if @rouson or @afanfa could help by providing me with a minimal example, preferably in C, of a program that links against GASNet (implemented on top of MPI) and how to compile it, that would help me expedite this process. It could be as simple as calling gasnet_init and then gasnet_exit.
It would be fabulous to have an option to link against GASNet as an alternative to MPI when building with CMake. Thanks for looking into this!
The following, case-insensitive, recursive search for "gasnet" in Makefiles might be helpful:
$ grep -i gasnet -r . | grep -i makefile
./integration/dist_transpose/Makefile_NS_GASNET:include /scratch2/scratchdirs/afanfa/GASNet-1.22.4/aries-conduit/aries-par.mak #/scratch/scratchdirs/afanfa/GASNet-1.22.4/gemini-conduit/gemini-par.mak
./integration/dist_transpose/Makefile_NS_GASNET:opencoarrays_dir=/global/u1/a/afanfa/Coarray/opencoarrays/gasnet
./integration/dist_transpose/Makefile_NS_GASNET: lib=-lcaf_gasnet
./integration/dist_transpose/Makefile_NS_GASNET: $(GASNET_LD) $(GASNET_LDFLAGS) $(LIBCAF_FLAGS) $(opt) coarray_distributed_transpose.o $(objects) -lgfortran -lm -o $(executable) $(lib) $(GASNET_LIBS)
./integration/pde_solvers/navier-stokes/Makefile_NS_GASNET:include /home/rouson/Downloads/GASNet-1.22.4/smp-conduit/smp-par.mak
./integration/pde_solvers/navier-stokes/Makefile_NS_GASNET: lib=-lcaf_gasnet
./integration/pde_solvers/navier-stokes/Makefile_NS_GASNET: $(GASNET_LD) $(GASNET_LDFLAGS) $(LIBCAF_FLAGS) $(opt) coarray-shear.o $(objects) -lgfortran -lm -o $(executable) $(lib) $(fft) $(GASNET_LIBS)
./unit/simple/Makefile:.SUFFIXES: .f90 .armci .mpi .gasnet
./unit/simple/Makefile:all: $(OBJS) armci gasnet mpi
./unit/simple/Makefile: /bin/rm -fr .o .armci .mpi .gasnet
./unit/simple/Makefile:gasnet: $(EXES:.exe=.gasnet)
./unit/simple/Makefile:.o.gasnet:
./unit/simple/Makefile: $(MPFC) -o $@ $< -lcaf_gasnet -L$(TOP)/gasnet $(GASNET_LDFLAGS)
Best to start with something very simple. The GASNet transport layer (libcaf_gasnet) has not been maintained in roughly 18 months, so its interface might have diverged from that of the MPI layer (libcaf_mpi). Please let us know if you run into any problems.
I'll be sure to keep you posted. I also heard back from Dan Bonachea who pointed me in the right direction, I think.
oy, it was a bit of a PITA, but I submitted a PR to get GASNet into Homebrew... waiting for them to resolve MPI issues still, and then I can investigate optionally building OCA against GASNet in the Homebrew formula.
GASNet is now available through Homebrew. This should help us add an option for a GASNet linked version of OCA to the Homebrew formula. Still waiting on the resolution of the MPI issues with Homebrew.
GASNet over MPI is the wrong thing to use in nearly all cases, and particularly for Mac, which is always a single-node. You should build the SMP and maybe the UDP conduits. These will perform much better.
As I documented in the PRK Travis infrastructure, GASNet with the MPI conduit hangs on Mac, at least with Berkeley UPC, whereas the SMP and UDP conduits are fine (all three are fine on Linux, FWIW).
@jeffhammond thanks for testing the GASNet implementation on OS X. I'll see what I can do to get a UDP and/or TCP implementation into Homebrew.
GASNet over MPI is the wrong thing to use in nearly all cases, and particularly for Mac, which is always a single-node. You should build the SMP and maybe the UDP conduits. These will perform much better.
Just to respond in a little bit more detail: I agree, of course the performance of TCP/UDP conduits on single node Macs will be better than MPI. My main desire for getting GASNet into Homebrew in the first place, however, was to enable testing of more serious HPC software in a local environment, performance was never a priority. Rather, I seem to recall having some issues getting the TCP/UDP builds working. I'll take another stab at getting the TCP/UDP conduits to compile via Homebrew. If I run into trouble, would you, @jeffhammond, be willing to have a short call to discuss how you build GASNet on OS X?
As I documented in the PRK Travis infrastructure, GASNet with the MPI conduit hangs on Mac, at least with Berkeley UPC, whereas the SMP and UDP conduits are fine (all three are fine on Linux, FWIW).
Very cool! This Travis-CI setup is a gem! Thanks for sharing it.
@zbeekman Sure. Google Chat is a low-latency, asynchronous way to reach me, and we can initiate synchronous mode from there as appropriate. My Google handle is listed on my Github profile.
While https://github.com/ParRes/Kernels/blob/master/travis/install-berkeley-upc.sh builds Berkeley UPC, most of the options are just drop-throughs to GASNet. It is on my TODO list to build GASNet itself since I need that for other things (including OpenCoarrays) anyways. I'll try to create the GASNet-only script soon and you can borrow from it as much as you like.
Indeed, I think the PRK Travis setup is about as exhaustive as it comes regarding parallel runtime systems. It took a lot of work...
OK, this issue may actually get closed in the not so distant future... we'll see! Here is a comment from the old PR about setting up a GCC requirement class that I may need to revisit: https://github.com/Homebrew/legacy-homebrew/pull/46547#r55078591
Here is the :sparkles: new :sparkles: PR against homebrew-core: https://github.com/Homebrew/homebrew-core/pull/8790
Very cool. Just as an FYI, I changed jobs within Intel and the PRK project is much lower priority now. That isn't to say I don't care or that I won't fix any bugs, but adding features is unlikely for a while because I have other things keeping me busy enough.
@jeffhammond congrats on the new job! Still at Intel or move somewhere else? Feel free to take this to a private thread, zbeekman at gmail dot com or gchat...
@zbeekman "...within Intel..." 😉 I'll email you details.
Merged in https://github.com/Homebrew/homebrew-core/commit/4644642650a37c2187ab8292e80eb238b40cd229
brew update
brew install opencoarrays
I'll try to post a link to the bintray analytics soon.
@rouson you can see download stats here: https://bintray.com/homebrew/bottles/opencoarrays#statistics
I wanted to gauge interest/support for creating and submitting a Homebrew Formula to build & install opencoarrays.
- Cons:
@zbeekman Broken for older systems, little customization.