bids-apps / MRtrix3_connectome

Generate subject connectomes from raw BIDS data & perform inter-subject connection density normalisation, using the MRtrix3 software package.
http://www.mrtrix.org/
Apache License 2.0

Aligning multiple processing streams #50

Open araikes opened 5 years ago

araikes commented 5 years ago

Hi @Lestropie, I have a grand plan to run a multimodal analysis using T1 information (particularly cortical thickness and gray matter density), rsFMRI, and DTI/tractography using the HCP MMP1 atlas. To that end, my analysis plan looks like this:

  1. BIDS-Apps/Freesurfer's recon-all, particularly for quality metrics and any future inquiries I may have.
  2. fMRIPREP, loading in the pre-run Freesurfer.
  3. XCP Engine (https://github.com/PennBBL/xcpEngine) for atlas-based ANTs cortical thickness on the fMRIPREP'd T1 images and atlas-based functional connectivity using the fMRIPREP'd rsFMRI and the XCP post-processed T1s (for coregistration).
  4. DWI preprocessing and connectome construction using MRtrix3_connectome.

My rationale for this ordering is to minimize duplicate (and potentially disparate) processing streams, particularly for the T1s, which would otherwise be processed by several of these steps (ANTs cortical thickness in XCP, 5ttgen in MRtrix3).

So here's my question: Is there a way for me to pipe the pre-run Freesurfer output (for parcellation and labeling purposes) and the fMRIPREP-preprocessed T1s (along with the brain mask) into MRtrix3_connectome as the target for 5ttgen, or will I need to follow some instructions (like the BATMAN tutorial) and process these images outside the Docker environment?

If it comes to that: if I place those images in the appropriate derivatives folders, will MRtrix3_connectome detect that those files exist and not try to rerun the anatomical processing stream?

I hope my question is clear enough.

Thanks

Lestropie commented 5 years ago

Hi @araikes,

Currently there's no way of exploiting data processed using other tools as input to this script. There is of course nothing stopping you from creating a fork of the tool and making tailored modifications specific for your own purposes.

Longer-term, enabling this sort of inter-application data utilisation is one of the principal goals of BIDS Derivatives. There needs to be a robust definition of how such data are stored by one application in order for another application to read and use them with any confidence. While you could make specialised modifications to dump data from one specific tool in a specific location and format, and have some other tool then read from that location, it would not be a generalised solution. Once BIDS Derivatives is merged into the main specification, I would probably look into getting this tool to scan the output directory for pipelines that may have provided a pre-processed T1 and possibly a parcellation; but unfortunately it's not something I can prioritise right now.

If you're interested in making modifications yourself I can certainly point you in the right direction with respect to the how and where.

Cheers Rob

araikes commented 5 years ago

Hi @Lestropie, thanks for the response. I kind of figured that, at least for the time being, it would be somewhat hacked together at best.

As I have time over the next little bit, I'll fork the repo and see if I can figure out a way to use other preprocessed outputs.

Thanks for the great tool.

araikes commented 5 years ago

One other question:

Is there a non-hacky way to do multi-session processing? Currently, it tells me that it can't find the dwi/fmap folders.

Lestropie commented 5 years ago

Currently the script is simply hard-wired to assume one session per subject. Navigating the BIDS input directory to determine e.g. a multi-session structure is something I'd like to add; I just hadn't explicitly listed it until now (#51). So it's possible to have a structure that adequately conforms to BIDS but that the script won't accept; handling the general case properly is much more difficult technically than hard-wiring the simple one.

You can just create a unique subject ID per session and run the script that way.
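For the record, that flattening can be scripted. Everything in this sketch is illustrative rather than behaviour of the tool: the `bids`/`bids_flat` directory names, the `sub-<label>s<session>` pseudo-subject naming scheme, and the tiny demo tree standing in for a real dataset are all my assumptions.

```shell
#!/bin/bash
# Illustrative sketch only: flatten a multi-session BIDS tree into one
# pseudo-subject per session, so a single-session-only script accepts it.
# Assumptions (mine, not the tool's): sessions live in ses-* directories,
# and filenames carry the session entity as "_ses-<label>_".

# Tiny demo input tree standing in for a real dataset:
mkdir -p bids/sub-01/ses-1/anat bids/sub-01/ses-2/anat
touch bids/sub-01/ses-1/anat/sub-01_ses-1_T1w.nii.gz
touch bids/sub-01/ses-2/anat/sub-01_ses-2_T1w.nii.gz

src=bids
dst=bids_flat
mkdir -p "$dst"
[ -f "$src/dataset_description.json" ] && cp "$src/dataset_description.json" "$dst/"

for subdir in "$src"/sub-*/; do
  sub=$(basename "$subdir")              # e.g. sub-01
  for sesdir in "$subdir"ses-*/; do
    [ -d "$sesdir" ] || continue
    ses=$(basename "$sesdir")            # e.g. ses-1
    newsub="${sub}s${ses#ses-}"          # pseudo-subject, e.g. sub-01s1
    mkdir -p "$dst/$newsub"
    cp -r "$sesdir/." "$dst/$newsub/"
    # Drop the session entity and rename the subject entity in filenames:
    find "$dst/$newsub" -type f | while read -r f; do
      b=$(basename "$f")
      nb=${b/_${ses}_/_}
      nb=${nb/#${sub}_/${newsub}_}
      [ "$nb" = "$b" ] || mv "$f" "$(dirname "$f")/$nb"
    done
  done
done
```

Sidecar JSON files get renamed the same way; anything session-specific recorded *inside* the metadata would still need manual attention. The parameter expansions are bash-specific, hence the `#!/bin/bash` shebang.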

araikes commented 5 years ago

@Lestropie I'll hack my way around it and if I find some time I'll try to edit the code to use pybids to grab the structure.
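For anyone attempting the same, the traversal pybids would perform can be approximated with the standard library. This `discover_sessions` helper is a hypothetical stand-in; its name and behaviour are mine, not part of pybids or this repository:

```python
from pathlib import Path

def discover_sessions(bids_dir):
    """Map each subject label to its session labels ([] for single-session).

    Hypothetical stdlib stand-in for pybids' layout traversal: it inspects
    directory names only, performs no BIDS validation, and is not part of
    pybids or MRtrix3_connectome.
    """
    structure = {}
    for sub in sorted(Path(bids_dir).glob("sub-*")):
        if not sub.is_dir():
            continue
        structure[sub.name[len("sub-"):]] = sorted(
            ses.name[len("ses-"):] for ses in sub.glob("ses-*") if ses.is_dir()
        )
    return structure
```

pybids' `BIDSLayout` would additionally validate the dataset and expose file-level queries; this sketch only reports the subject/session directory structure, which is the part the script currently assumes away.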

Thanks

araikes commented 5 years ago

Hi @Lestropie, I have a kind of janky version of your script that allows for longitudinal studies as well as reusing fMRIPrep pre-processed T1s. It could certainly be improved to use pybids, but it appears serviceable (it's running right now, currently at the eddy stage).

Here's my current question: How usable is this script without mtnormalise functionality in it? If I run it to completion on my subjects and then run the group-level steps, is it going to give me invalid/unusable findings?

Thanks

Lestropie commented 5 years ago

The behaviour of the script when the group-level analysis is used is almost equivalent to how inter-subject intensity normalisation was achieved prior to the creation of the precursor to the mtnormalise command. So it's no more invalid than every AFD analysis that was done before that method existed. It's simply a different mechanism being used for intensity normalisation, and each has its pros and cons.

I've had a manuscript accepted that demonstrates the necessity of appropriate connection density normalisation, and I've drafted another, currently awaiting feedback, that describes and demonstrates it in more detail. So hopefully those will clarify things soon. Eventually I'll incorporate mtnormalise here, but it'll take a little more trickery to do it right, and that will probably wait until I write a publication on this tool.

Ultimately, the cross-subject variance within structural connectome data due to the inherently poor reproducibility of tractography probably swamps any variance attributable to the choice of intensity normalisation method.

araikes commented 5 years ago

Hi @Lestropie, it's good to know that I can use those outputs. Broadly speaking, I'm new to MRtrix, so it helps to understand that AFD from this script will let me achieve what I'm looking for. My question originally arose from the code comments you had in the script, as well as the BATMAN tutorial.

Thanks again for all the help.