FrankD412 opened 6 years ago
As a note for consideration, I'm thinking about transitioning the step list to a dictionary where the name of the step is the key. That makes it cleaner instead of having hyphens everywhere, but that's more of an aesthetic choice.
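For contrast, a list-based form with a `name` key per entry might look like the following (a hypothetical sketch of the current style, not verbatim from the spec):

```yaml
study:
    - name: make-lulesh
      description: Build the serial version of LULESH.
      depends: []
    - name: run-lulesh
      description: Run LULESH.
      depends: [make-lulesh]
```

The dictionary form below drops the `- name:` boilerplate and keys each step directly.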
Here's an example of what a study block might look like:
```yaml
study:
    make-lulesh:
        description: Build the serial version of LULESH.
        depends: []
        run:
            cmd: |
                cd $(LULESH)
                sed -i 's/^CXX = $(MPICXX)/CXX = $(SERCXX)/' ./Makefile
                sed -i 's/^CXXFLAGS = -g -O3 -fopenmp/#CXXFLAGS = -g -O3 -fopenmp/' ./Makefile
                sed -i 's/^#LDFLAGS = -g -O3/LDFLAGS = -g -O3/' ./Makefile
                sed -i 's/^LDFLAGS = -g -O3 -fopenmp/#LDFLAGS = -g -O3 -fopenmp/' ./Makefile
                sed -i 's/^#CXXFLAGS = -g -O3 -I/CXXFLAGS = -g -O3 -I/' ./Makefile
                make clean
                make
        resources:
            # Not needed, but illustrates the default.
            adapter: local
    run-lulesh:
        description: Run LULESH.
        depends: [make-lulesh]
        run:
            cmd: |
                $(LULESH)/lulesh2.0 -s $(SIZE) -i $(ITERATIONS) -p > $(outfile)
        resources:
            adapter: flux
            nodes: 1
            cores: 1
            tasks: 1
            cores per task: 1
    post-process-lulesh:
        description: Post process all LULESH results.
        depends: [run-lulesh_*]
        run:
            cmd: |
                echo "Unparameterized step with Parameter Independent dependencies." >> out.log
                echo $(run-lulesh.workspace) > out.log
                ls $(run-lulesh.workspace) > ls.log
        resources:
            # Not needed, but illustrates the default.
            adapter: local
```
I'm considering renaming `study` to `steps` -- the file as a whole is a study, so using `study` for the steps of the study feels like a bit of a misnomer.
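Under that rename, only the top-level key would change (a sketch, abbreviated):

```yaml
steps:
    make-lulesh:
        description: Build the serial version of LULESH.
        depends: []
    run-lulesh:
        description: Run LULESH.
        depends: [make-lulesh]
```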
Discussion has moved to PR #151 -- just as an FYI to those who might be looking for more information.
@gonsie -- this is the issue I mentioned with specification improvements. Feel free to post thoughts on data dependency here.
So I've been looking at some of the limitations of the YAML specification as it is, and I've come up with a new structure for it.
The idea behind splitting out the resources is twofold: any key that exists in `resources:` becomes available as `$(resources.<name>)` and can be used in a step. Users could then "manually" select which MPI they want to use in a fashion similar to `mpirun -n $(resources.tasks) ... some --command --to run`. In principle, users could even do shell math to compute MPI-related values (and Maestro could even calculate high-level settings by inferring from statically set values).
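As a sketch of that idea, a step could reference its own `resources:` values inside `cmd` (the nesting and the `tasks`/`cores per task` keys follow the example above; the exact substitution semantics are an assumption, not settled spec):

```yaml
run-lulesh:
    run:
        cmd: |
            # $(resources.tasks) would expand to 4 from the block below.
            mpirun -n $(resources.tasks) $(LULESH)/lulesh2.0 -s $(SIZE) -i $(ITERATIONS) -p > $(outfile)
    resources:
        adapter: flux
        nodes: 1
        tasks: 4
        cores per task: 1
```

This keeps scheduler-facing values in one place, so changing `tasks` automatically updates the launch command.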