
= Running the Heat Transfer/Reduction pipeline with Swift

////
This lead syntax does not work with the asciidoc command line tool
////

[.lead]
This example shows how to run the heat transfer/reduction pipeline with Swift. This application works as shown in the figure, with the grey callouts indicating user-controlled options for the program.

image::CODAR-fig.jpg[]

* Compression method:
** Identity (no compression)
** Compress via SZ, with specified accuracy
** Compress via ZFP, with specified accuracy

* Staging transport method:
** FLEXPATH: the only one tested so far. Description TBD
** MPI: Description TBD
** DataSpaces: Description TBD

In the following, we describe in turn: installation; configuring and running the basic pipeline, with and without compression; running a multiple-experiment campaign with Cheetah; and developer quick start notes.

== Installation

. We assume that Spack is installed on your system. If it is not, the following steps work on a Linux Ubuntu workstation (some elements, such as zlib and zsh, may already be available on some systems):
+
----
sudo apt install git
sudo apt install curl
sudo apt install gfortran
sudo apt install bison
sudo apt install flex
sudo apt install environment-modules
sudo apt install zlib
sudo apt install zsh
git clone https://github.com/CODARcode/spack
. spack/share/spack/setup-env.sh
spack compilers
----
+
Then edit ~/.spack/compilers.yaml (on Ubuntu, at least, it is in ~/.spack/linux/compilers.yaml) and add /usr/bin/gfortran for both f77 and fc.
+
It seems to help to restart your shell at this point.
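+
One possible shortcut for that edit, assuming your Spack version supports the config subcommand (the gfortran path is only an example; use whatever spack compilers detected on your machine):
+
----
# Opens compilers.yaml in $EDITOR; in the gcc entry, set the Fortran compilers:
#   f77: /usr/bin/gfortran
#   fc:  /usr/bin/gfortran
spack config edit compilers
----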

. Install Savanna via Spack and then spack load the required modules:
+
----
cd spack
git checkout -t origin/codar
spack install savanna@develop
spack load stc turbine adios mpich sz dataspaces
----
. Build the heat transfer simulator and stager
.. Determine the location of the code (spack find -p savanna), say $L, and cd to $L/Heat_Transfer. The following works as long as you have only one Savanna installed:
+
----
path2savanna=$(spack find -p savanna | grep savanna | awk '{ print $2 }')
cd $path2savanna/Heat_Transfer
----

.. Edit the Makefile to set CC=mpicc and FC=mpif90 (a sketch of this edit appears after this list)
.. Build the heat transfer simulator and stager:
+
----
make
cd stage_write
make
----
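The Makefile edit in the step above can be done by hand in an editor; the following is one possible shortcut, a sketch that assumes CC and FC are assigned at the start of a line in the Makefile:

----
# Point the build at the MPI compiler wrappers.
sed -i 's/^CC *=.*/CC=mpicc/' Makefile
sed -i 's/^FC *=.*/FC=mpif90/' Makefile
----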

== Configure and run the basic pipeline, with and without compression

. Run the workflow
.. Configure the staging method
... To use FlexPath: edit heat_transfer.xml to enable the FLEXPATH method. (Ensure that the following text, near the end of the file, is not commented out: <method group="heat" method="FLEXPATH">QUEUE_SIZE=4;verbose=3</method>. All other lines starting with <method group="heat" method... should be commented out.)
... To use DataSpaces: edit heat_transfer.xml to enable the DATASPACES method. (In this case the following line should not be commented out: <method group="heat" method="DATASPACES"/>, while all other lines starting with <method group="heat" method... should be commented out.)
... To use MPI: edit heat_transfer.xml to enable the MPI method. (In this case the following line should not be commented out: <method group="heat" method="MPI"/>, while all other lines starting with <method group="heat" method... should be commented out.) A quick way to check this configuration is shown at the end of this step.
.. Edit run-workflow.sh to set LAUNCH to point to /dir/to/mpix-launch-swift/src. With Spack, for example, this can be set as follows:
+
----
LAUNCH=$(spack find -p mpix-launch-swift | grep mpix-launch-swift | awk '{print $2}')/src
----
.. You can also edit workflow.swift to modify the number of processes and the problem size.
.. Then type
+
----
./run-workflow.sh P [DATASPACES|FLEXPATH|MPI]
----
+
where P is the number of MPI processes. (With the default settings in workflow.swift, 12 processes are used for the simulation and 3 for the stager, requiring P = 12 + 3 + 1 for Swift = 16. If the DataSpaces transport method is used, an additional process is required, for a total of 17. For a quick run, edit workflow.swift to use one process each for the simulation and the stager, requiring P = 1 + 1 + 1 = 3, or four with DataSpaces.) The second argument specifies the staging transport; if none is specified, FlexPath is used.
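+
Before launching, you can optionally confirm which staging method is enabled; the following command is just a quick check (only one of the listed lines should be uncommented):
+
----
grep -n '<method group="heat"' heat_transfer.xml
----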

. Run with compression

To perform compression through ADIOS, we need to provide additional options to the stager.

These are all specified in the arguments2 variable in workflow.swift (line 62).

For example, the following line requests that the stager compress the T and dT variables with the SZ method, maintaining absolute errors lower than 0.001 (the latter is an SZ-specific parameter; more details can be found in the ADIOS manual):


----
arguments2 = split("heat.bp staged.bp FLEXPATH \"\" MPI \"\" \"T,dt\" \"SZ:accuracy=0.001\"", " ");
----

This second example does the same thing but uses the ZFP compression library:

----
arguments2 = split("heat.bp staged.bp FLEXPATH \"\" MPI \"\" \"T,dt\" \"ZFP:accuracy=0.001\"", " ");
----
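After saving the change to workflow.swift, the pipeline is launched exactly as in the previous step; for example, with the default FlexPath process counts described above (the bpls inspection is optional and assumes the ADIOS bin/ directory is on your PATH):

----
./run-workflow.sh 16 FLEXPATH
# Optionally inspect the staged (compressed) output with ADIOS's bpls utility:
bpls -la staged.bp
----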

== Run a multiple-experiment campaign with Cheetah

We have described how to execute a single instance of the pipeline. The Cheetah system allows you to run a campaign involving a set of such executions, each with different configuration parameters.

Instructions on how to run with Cheetah are https://github.com/CODARcode/cheetah[on a separate page].

== Developer quick start notes

These notes are for developers who want to work from git clones rather than from Spack.

. Install all APT packages in the <<user-content-installation,Installation>> section above (or equivalent on your OS):
+
----
$ sudo apt-get update
$ sudo apt-get -y install gcc g++ gfortran mpich
$ sudo apt-get -y install tcl tcl-dev swig zsh ant
$ sudo apt-get -y install build-essential autoconf
$ sudo apt-get -y install libxml2 libxml2-dev gsoap
$ sudo apt-get -y install bison flex
$ sudo apt-get -y install cmake subversion git
----

. Download/install EVPath/FlexPath
+
Directions here: https://www.cc.gatech.edu/systems/projects/EVPath
+
----
$ mkdir evpath-build
$ cd evpath-build
$ wget http://www.cc.gatech.edu/systems/projects/EVPath/chaos_bootstrap.pl
$ perl ./chaos_bootstrap.pl -i
----
+
Create the evpath installation directory in advance, and specify it when running "perl ./chaos_bootstrap.pl -i". Next, edit chaos_build_config to comment out (with %) the BUILDLIST entries after evpath. Then:
+
----
$ perl ./chaos_build.pl
----

+
NOTE: nnti will fail; that is OK.
. Download/install ADIOS
+
----
$ wget http://users.nccs.gov/~pnorbert/adios-1.11.0.tar.gz
$ tar xzf adios-1.11.0.tar.gz
$ cd adios-1.11.0
$ export LIBS=-pthread
$ ./configure --prefix=... --with-flexpath=...
$ make && make install
----

+
Then put the ADIOS bin/ directory in your PATH.
. Clone/install Swift/T
+
----
$ git clone git@github.com:swift-lang/swift-t.git
# or
$ git clone https://github.com/swift-lang/swift-t.git
----
+
Follow the directions here: http://swift-lang.github.io/swift-t/guide.html#Build_configuration or http://swift-lang.github.io/swift-t/guide.html#_from_source
+
Then put the STC and Turbine bin/ directories in your PATH.
+
A certain amount of memory is required for the build (16 GB is enough; 1 GB is not).
. Install the MPIX_Launch module
+
Follow the directions here: https://bitbucket.org/jmjwozniak/mpix_launch_swift
. Build the heat_transfer simulator code:
+
----
$ export LD_LIBRARY_PATH=.../evpath/lib   # the lib directory of the EVPath installation
$ make
----

. Build the stage_write program
+
----
$ cd stage_write
$ make
$ cd -
----

. Edit run-workflow.sh
+
Set the LAUNCH variable to the MPIX_Launch_Swift src/ directory (the one containing pkgIndex.tcl); this allows STC to find the module. Set the EVPATH variable to the EVPath installation directory (the one specified in chaos_bootstrap.pl); this is needed so that the ADIOS-linked programs can find EVPath at run time. A sketch of these settings appears after this list.
. Run ./run-workflow.sh:
+
----
$ ./run-workflow.sh 16 FLEXPATH
----
+
It should take 20 seconds or less; if it hangs, kill it and check for error messages.
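As a concrete illustration of the run-workflow.sh edit above, the two variables might look as follows (the paths are hypothetical; use the directories you actually chose for MPIX_Launch_Swift and EVPath):

----
# Inside run-workflow.sh -- adjust both paths to your installation.
LAUNCH=$HOME/mpix_launch_swift/src   # must contain pkgIndex.tcl
EVPATH=$HOME/evpath                  # installation directory given to chaos_bootstrap.pl
----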

=== DataSpaces:

. Download/install DataSpaces from http://personal.cac.rutgers.edu/TASSL/projects/data/download.html
+
----
./configure --prefix=... CC=mpicc FC=mpif90
----
. Put the DataSpaces bin directory in your PATH
. Rebuild ADIOS: reconfigure with --with-dataspaces=...
. If you have already compiled the simulator (this directory) or stage_write, run 'make clean ; make' for each of them (because ADIOS was rebuilt)
. Edit heat_transfer.xml to enable method="DATASPACES" and disable the other methods
. Run ./run-workflow.sh, but provide more processes:
+
----
$ ./run-workflow.sh 17 DATASPACES
----
