Relatedly, for datasets like this it would be nice to be able to skip the bulk of the preprocessing as well (e.g. T1w-T2w registration, N4 correction, etc.). I wonder if we could add a flag to skip everything except the registration to space-CorObl?
Re: skipping pre-processing (averaging, N4, etc.) and T1w/T2w co-registration -- I actually implemented that in the last couple of days (still in my dev branch); there are --skip_preproc and --skip_coreg options there. I can probably do a PR soon to merge this and a bunch of new things.
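For reference, a sketch of how those flags might be used, assuming the standard BIDS-app style command line (paths and core count are placeholders; --skip_preproc and --skip_coreg are the dev-branch flag names mentioned above):

```
# sketch only: placeholder paths; --cores is a standard Snakemake option passed through
hippunfold /path/to/bids /path/to/output participant \
    --skip_preproc --skip_coreg --cores 8
```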
As for a workflow that doesn't require a T1w at all, it is doable -- we would need to use the T2w image to register to corobl, and then bring everything to T2w space instead of T1w space. I think I could make that happen by making the reference space a wildcard (instead of hardcoding it to T1w), but overall that would be a pretty substantial change. So for the time being, one hack that could work is updating the config to pull in the T2w image in place of the T1w, and doing the same for the CITI corobl template. It would be best to do this with the above-mentioned --skip_coreg in place too, so that it doesn't try to register the image to itself.
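To make that T2w-in-place-of-T1w hack a bit more concrete, here is a rough sketch of the kind of edit it implies in the snakebids config; the actual key names and template entry in hippunfold's config.yml may differ, so treat this as illustrative only:

```yaml
# illustrative only -- actual hippunfold config keys/paths may differ
pybids_inputs:
  T1w:                  # keep the input name so downstream rules are unchanged,
    filters:            # but point it at the T2w images
      suffix: T2w
      extension: .nii.gz
    wildcards:
      - subject
      - session

# likewise, swap the corobl template for its T2w counterpart (hypothetical key/path)
template_t1w: /path/to/CITI168_T2w_corobl.nii.gz
```

Combined with --skip_coreg, the pipeline would then never try to register the image to itself.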
Great, thanks.
For the time being I was able to use my workaround with the cropped T1w data.
Also, the specific error message I was getting was just because I had to update snakebids in my environment - something we should remember to do periodically while developing!
Also, I just merged in a new PR -- it has changes from the last few weeks of development now.
Awesome!
Another thing I'm noticing since I'm trying to run these cropped images: the parallelization is set up such that one error stops all processes. Perhaps we could have processing continue for the other subjects even if one subject fails?
Similarly, I think processing of one hemisphere should continue even if the other hemisphere fails -- which would be really helpful since the dataset I'm looking at now was cropped separately around the left/right hippocampi.
Yep, you can use the --keep-going flag (it's a Snakemake option). If you're running many subjects at once you can also use the cc-slurm profile; the execution groups are set up so it will submit one Slurm job per subject.
Also, the other useful flag is --keep-incomplete so it doesn't delete output from failed rules
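Putting those together, a run that survives per-subject (or per-hemisphere) failures might look something like the following; paths are placeholders, --keep-going and --keep-incomplete are standard Snakemake flags that get passed through, and the cc-slurm profile is assumed to be installed and picked up via Snakemake's --profile mechanism:

```
# keep running unaffected jobs after a failure, and keep partial outputs around
hippunfold /path/to/bids /path/to/output participant \
    --keep-going --keep-incomplete --cores 8

# on a cluster, the cc-slurm profile's execution groups submit roughly one Slurm job per subject
hippunfold /path/to/bids /path/to/output participant --profile cc-slurm
```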
Fantastic. Do you think it's worth including these flags in the usage text? Or would these be considered more general Snakemake options?
I was thinking of exposing some of the more useful Snakemake flags in the usage, and it might also be useful to have some sensible defaults (maybe by having some built-in profiles like cc-slurm), but those are things that should be done in snakebids. We should make an issue there as a placeholder for the time being.
Since Snakemake flags are now enumerated on the documentation page, I'm going to close this.
Currently, the pipeline REQUIRES a T1w image in the input BIDS dir, but I'd like to run on a dataset of only TSE slabs (/scratch/jdekrake/Hippocampal_AutoTop_tests/Generalizability_dataset on graham). The data is from the ASHS atlas from the recent Berron 7T segmentation paper. Here is my error, which I attribute to this line: https://github.com/khanlab/hippunfold/blob/68a137fa900b7a8d40b6a5760f16928b62209943/workflow/Snakefile#L10
I'm not super familiar, but I think removing this will break further preprocessing (https://github.com/khanlab/hippunfold/blob/master/workflow/rules/preproc_t2.smk)
For the time being I can see a workaround using the T1w data that is available, but only in their cropped hippocampal space. @akhanf, just wondering whether you think it would be worth having the ability to run with no T1w?