ReproNim / simple_workflow-02


Add ANTs labeling protocol #8

Closed · satra closed this issue 5 years ago

satra commented 6 years ago

i've asked the ants folks for approaches they use to label brain structures.

https://github.com/ANTsX/ANTs/issues/587

for simple 2, if we want to be as good as possible we should use the better method (1). i'll implement and test that in the container i'm building.

i'll add container details to #5.

satra commented 6 years ago

@dnkennedy - if you get details of the ABIDE protocol do post it here.

satra commented 6 years ago

this is the current nipype workflow on top of antsCorticalThickness.sh output to label and extract measures

from glob import glob
import os

from nipype import Workflow, MapNode, Node
from nipype.interfaces.ants import ApplyTransforms, AntsJointFusion, LabelGeometry
from nipype.utils.misc import human_order_sorted

# template-to-subject transforms, reversed so the warp precedes the affine
T = glob('/data/out/ants_subjects/arno/antsTemplateToSubject*')[::-1]
ref = '/data/T1.nii.gz'
mask = '/data/out/ants_subjects/arno/antsBrainExtractionMask.nii.gz'
# OASIS-TRT atlas T1s and label volumes in OASIS Atropos template space,
# naturally sorted so each T1 pairs with its label file by index
T1s = human_order_sorted(glob('/opt/data/OASIS-TRT_brains_to_OASIS_Atropos_template/*.nii.gz'))
labels = human_order_sorted(glob('/opt/data/OASIS-TRT_labels_to_OASIS_Atropos_template/*.nii.gz'))
thickness = '/data/out/ants_subjects/arno/antsCorticalThickness.nii.gz'
N = 20  # number of atlases to feed into joint fusion

wf = Workflow('labelflow')

# warp the atlas T1s into subject space
transformer = MapNode(ApplyTransforms(), iterfield=['input_image'], name="transformer")
transformer.inputs.reference_image = ref
transformer.inputs.transforms = T
transformer.inputs.input_image = T1s[:N]
transformer.inputs.dimension = 3
transformer.inputs.invert_transform_flags = [False, False]

# warp the atlas label volumes with nearest-neighbor interpolation,
# since label IDs are categorical and must not be averaged
transformer_nn = MapNode(ApplyTransforms(), iterfield=['input_image'], name="transformer_nn")
transformer_nn.inputs.reference_image = ref
transformer_nn.inputs.transforms = T
transformer_nn.inputs.dimension = 3
transformer_nn.inputs.invert_transform_flags = [False, False]
transformer_nn.inputs.input_image = labels[:N]
transformer_nn.inputs.interpolation = 'NearestNeighbor'

# fuse the warped atlas labels into a single subject-space labeling
labeler = Node(AntsJointFusion(), name='labeler')
labeler.inputs.dimension = 3
labeler.inputs.target_image = [ref]
labeler.inputs.out_label_fusion = 'label.nii.gz'
labeler.inputs.mask_image = mask
labeler.inputs.num_threads = 8
wf.connect(transformer, 'output_image', labeler, 'atlas_image')
wf.connect(transformer_nn, 'output_image', labeler, 'atlas_segmentation_image')

# per-label geometry plus mean intensity (here, thickness) per label, written to CSV
tocsv = Node(LabelGeometry(), name='get_measures')
tocsv.inputs.intensity_image = thickness
wf.connect(labeler, 'out_label_fusion', tocsv, 'label_image')

wf.base_dir = os.getcwd()
wf.config['monitoring'] = {'enabled': True}
wf.run(plugin='MultiProc')
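
(editorial note) The workflow above relies on human_order_sorted keeping the atlas T1s and label volumes in the same order, so that T1s[i] pairs with labels[i]. A minimal sketch of that natural ("human order") sorting, using a hypothetical natural_sort helper rather than the nipype implementation itself:

```python
import re

def natural_key(s):
    # Split the string into digit and non-digit runs so numeric runs
    # compare as integers: 'atlas2' sorts before 'atlas10'.
    return [int(tok) if tok.isdigit() else tok
            for tok in re.split(r'(\d+)', s)]

def natural_sort(paths):
    # Plain lexicographic sort would put 'atlas10' before 'atlas2'.
    return sorted(paths, key=natural_key)

files = ['OASIS-TRT-20-10.nii.gz', 'OASIS-TRT-20-2.nii.gz', 'OASIS-TRT-20-1.nii.gz']
print(natural_sort(files))
# ['OASIS-TRT-20-1.nii.gz', 'OASIS-TRT-20-2.nii.gz', 'OASIS-TRT-20-10.nii.gz']
```

Without a sort of this kind, a glob over 20 numbered atlases could mismatch T1s against the wrong label volumes.
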
satra commented 6 years ago

@binarybottle - what would you think about adding the above to the mindboggle123 script?

this would:

  1. label the T1 using joint fusion
  2. generate ants based volume and thickness info (using the OASIS 20 atlases)

if that sounds good to you, i will send a PR to add this in. i would also augment the first transformer to use BSpline.

binarybottle commented 6 years ago

Sounds great, @satra!

  1. Mindboggle already generates a joint-fusion-labeled T1. Would people be confused by multiple joint-fusion-labelings of the same T1?

  2. LabelGeometry is another issue, so including its output would be great!

satra commented 6 years ago

@binarybottle - i'm not sure mindboggle generates a joint-fusion labeled T1, since the 20 atlases are not included anywhere - is there a function that downloads them and applies joint fusion?

you may be talking about just registering the joint-fusion-created atlas to the individual T1, which is different from applying joint fusion to the T1 using multiple atlases.
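
(editorial note) The distinction matters: multi-atlas joint fusion decides each voxel's label from the votes of all warped atlases, whereas registering a single pre-fused atlas just copies one labeling over. A stripped-down sketch using plain majority voting over per-atlas label maps (an illustration only; the actual ANTs joint fusion weights each atlas's vote by local intensity similarity to the target):

```python
from collections import Counter

def majority_vote_fusion(atlas_label_maps):
    # atlas_label_maps: one flat list of voxel labels per warped atlas.
    # Every atlas votes equally here; real joint fusion uses
    # similarity-weighted votes.
    n_vox = len(atlas_label_maps[0])
    fused = []
    for v in range(n_vox):
        votes = Counter(m[v] for m in atlas_label_maps)
        fused.append(votes.most_common(1)[0][0])
    return fused

# three toy "atlases" labeling the same 4 voxels
atlases = [
    [17, 17, 53, 0],
    [17, 53, 53, 0],
    [17, 17, 53, 53],
]
print(majority_vote_fusion(atlases))  # [17, 17, 53, 0]
```

With 20 OASIS-TRT atlases, disagreements at structure boundaries get resolved voxel-by-voxel, which is where the multi-atlas approach earns its accuracy.
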

binarybottle commented 6 years ago

Yes, sorry -- thank you for clarifying. Looking forward to a PR!

satra commented 5 years ago

this is also done.