Open jacobtfisher opened 3 years ago
Hi @jacobtfisher, it is difficult to use -m with Singularity or Docker images, but there is a trick here: https://xcpengine.readthedocs.io/containers/index.html#using-slurm-to-parallelize-across-subjects
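The linked trick boils down to splitting the multi-subject cohort file into single-subject cohort files and submitting one SLURM job per subject. A minimal sketch of the splitting step (the cohort columns, file names, and job-script name here are illustrative assumptions, not taken from the docs):

```shell
#!/bin/sh
# Sketch: split a multi-subject cohort file into one-subject cohort files,
# then submit one SLURM job per subject. Demo cohort created inline so the
# script is self-contained; real cohort columns will differ.
printf 'id0,img\nsub-01,sub-01.nii.gz\nsub-02,sub-02.nii.gz\n' > cohort_full.csv

mkdir -p cohorts
header=$(head -n 1 cohort_full.csv)   # keep the header row for every split file

i=0
tail -n +2 cohort_full.csv | while IFS= read -r row; do
  i=$((i + 1))
  # each split file = original header + one subject row
  printf '%s\n%s\n' "$header" "$row" > "cohorts/cohort_${i}.csv"
  # sbatch run_xcpengine.sh "cohorts/cohort_${i}.csv"   # one job per subject
done
```

The sbatch line is commented out since `run_xcpengine.sh` is a hypothetical wrapper you would write around your container invocation.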
Hi @a3sha2 -- thanks for getting back to me! I'm looking at the page now and I'm not seeing anything about the -m
option. Maybe I'm looking in the wrong place. Splitting up the cohort files to run in parallel is what I'm doing right now, but that option doesn't create the groupwise analyses (unless I'm doing something incorrectly). I'd like to be able to see the qcfc
output for all subjects but run the job in parallel.
Thanks again!
I get you @jacobtfisher. You can't use the -m flag with a Singularity or Docker image. You can still run qcfc after xcpEngine has completed for all the subjects. We added a utility command that does that:
https://xcpengine.readthedocs.io/utils/qcfc.html#qcfc
If you are using the Docker image (the --entrypoint flag implies a docker run command, which was missing above):

docker run --rm -it \
  --entrypoint /xcpEngine/utils/qcfc.R \
  pennbbl/xcpEngine:latest \
  -c <cohort> \
  -o <output root> \
  [-s <multiple comparisons correction> \
   -t <significance threshold> \
   -n <confound> \
   -y <conformula>]
If you are using the Singularity image, it works the same way:

singularity exec xcpengine.simg /xcpEngine/utils/qcfc.R \
  -c <cohort> \
  -o <output root> \
  [-s <multiple comparisons correction> \
   -t <significance threshold> \
   -n <confound> \
   -y <conformula>]
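For concreteness, a hypothetical filled-in Singularity call; the paths, the -B bind mount, and the option values (fdr, 0.05) are illustrative assumptions, not values from the xcpEngine docs:

```shell
#!/bin/sh
# Build the qcfc command as a string; in real use you would run it directly.
# -B /data:/data makes the host data directory visible inside the container.
cmd="singularity exec -B /data:/data xcpengine.simg /xcpEngine/utils/qcfc.R \
  -c /data/cohort_full.csv \
  -o /data/xcp_output/qcfc \
  -s fdr \
  -t 0.05"
echo "$cmd"
```

Run against the full multi-subject cohort file, this produces the group-level qcfc output even when the subject-level xcpEngine runs were parallelized with split cohort files.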
Hi there. Thanks for putting together such a useful tool. I just have a quick question about the -m flag. The description provided says:

I'd like to be able to use this flag rather than breaking up the cohort file, since (I assume) it would let me run xcpEngine in parallel while still generating the group-level outputs (which don't seem to work when the cohort file is split into individual subjects for running in parallel). What sort of file would I specify in order to use the -m flag? Is this option compatible with SLURM?

Thank you!