Open yarikoptic opened 6 years ago
@yarikoptic - thanks for the comments. How about we have a short call some time this week (but not Friday), after Satra gets to some source of internet? Satra was considering showing some nipype examples, but we thought that we might prefer to add some testing; in the end I had a feeling that we will not have time for testing (but I might be wrong). About the script: remember that Michael had bash scripts (e.g. this). `datalad rerun --since= --script myanalysis.sh` does generate the entire analysis script as it was recorded using `datalad run`.
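For reference, the record-then-export flow behind that command can be sketched as follows. The `datalad` calls assume a dataset that already has `datalad run` records and are left as comments so the sketch runs anywhere; the `myanalysis.sh` content written out below is illustrative, not verbatim DataLad output, and the `bet` filenames are placeholders.

```shell
# Sketch: record a step with `datalad run`, then export the recorded
# history as a single shell script. The datalad commands assume an
# existing DataLad dataset with run records, so they stay as comments:
#
#   datalad run -m "skull-strip" "bet sub-01_T1w.nii.gz sub-01_brain.nii.gz"
#   datalad rerun --since= --script myanalysis.sh
#
# Roughly, the exported myanalysis.sh contains the recorded shell
# commands; hypothetical content written out here for illustration:
cat > myanalysis.sh <<'EOF'
#!/bin/sh
# commands recorded via `datalad run` (illustrative, not verbatim output)
bet sub-01_T1w.nii.gz sub-01_brain.nii.gz
EOF
chmod +x myanalysis.sh
```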
This is a continuation of https://github.com/ReproNim/sfn2018-training/pull/5, and I would like to take my last comment back somewhat: I do not "like" the reprozip sub-section either at the beginning or the end of workflows. @satra might like to chime in to agree or disagree.
I feel that reprozip should be introduced within the ReproEnv module, since that is what it does: it captures the environment so it can be reused. Note also that the slides seem to pile up too much in the `reprozip` example commands: `datalad run` using `singularity` to run `reprozip`, so it captures what is already inside that singularity container... what is the point? How realistic (besides getting a minimized bet-specific image) would such a use case be? If I already have a singularity image, I would just use it directly. IMHO the reprozip example shouldn't use datalad or singularity at all; it should demonstrate how the current environment could be captured into a container (a good thing to show after neurodocker), which could then be executed (via docker or singularity).

Then the workflows section IMHO should be more about how those individual steps and environments (which we established via neurodocker, reprozip, or otherwise) could be brought together (it is the "Neuroimaging workflows" section after all). And that is what that presentation seems to start with (e.g. Example dataflows), but then it goes back into "tracing". I think instead it should expand on how to glue those pieces together and do "a workflow" (i.e. not just a single `bet` command). And that is actually what we do in the following sections (ReproIn, DataLad GLM) via git/datalad, "manually" going through all the steps from raw data conversion... That becomes a demonstration of a "manual workflow" (using those containers we learned/created) while recording history in git. With `datalad rerun` it is kinda cool, but generally we do not want to run all workflow steps separately. So then we could go into a Nipype workflow, which automates such complex workflows so that a single command does a lot! It could then be demonstrated by running an entire (well, again just some basic) workflow using the same `datalad containers-run` but calling the nipype workflow, thus making the section complete.

So, I will leave it to you guys (@djarecka and @satra) to decide; meanwhile I will fix up the reproin/datalad subsections without changing any timing or removing any other section, but there is work to be done there!
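To make the alternative concrete: the reprozip demo argued for above (capture the current environment directly, replay via docker or singularity, with no datalad/singularity wrapping around the capture) would look roughly like this. The `bet` inputs and package name are placeholders, and the commands are guarded so the sketch is a no-op where reprozip is not installed.

```shell
# Sketch of the proposed reprozip flow: trace a command in the *current*
# environment, pack it, and replay elsewhere -- no datalad or singularity
# layer around the capture itself.
if command -v reprozip >/dev/null 2>&1; then
  # trace the files, libraries, and environment the command touches
  reprozip trace bet sub-01_T1w.nii.gz sub-01_brain.nii.gz
  # bundle everything the trace found into a portable package
  reprozip pack bet_demo.rpz
  # on another machine the package could then be replayed, e.g.:
  #   reprounzip docker setup bet_demo.rpz ./bet_demo
  #   reprounzip docker run ./bet_demo
else
  echo "reprozip not installed; sketch only"
fi
# note completion in a small log (handy when pasting this into a demo)
echo "reprozip sketch finished" >> reprozip_sketch.log
```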