This repository releases the source code of "GeoLab: Geometric-based tractography parcellation of superficial white matter", accepted at ISBI 2023 (https://arxiv.org/abs/2303.01147). It also makes available the atlas used in the article.
GeoLab is a tool for superficial white matter parcellation. It adapts the RecoBundles framework (designed for deep white matter parcellation) to work on superficial white matter. The method outperforms state-of-the-art approaches on semi-ground truth (ARCHI dataset) and achieves comparable results on the UK Biobank dataset.
The contents of this repository are released under the Apache-2.0 license.
Install dependencies.
In ./GeoLab/CMakeLists.txt, line 19 (set(PYTHON_BINARY "/usr/bin/env python3")), replace "/usr/bin/env python3" with the path to your Python binary if you are using a virtual environment or if your Python binary is not on the default path.
Clone Git repository and compile:
$ git clone https://github.com/vindasna/GeoLab
$ cd GeoLab
$ mkdir build
$ cd build
$ cmake ..
$ make
Configure PATH: edit the startup ~/.bashrc or /etc/bash.bashrc file manually by adding these lines:
$ export PATH=/
$ export ESBA_DIR=/
Check the installation:
$ ProjectAtlasGeoLab -h
To extract the bundles of the ESBA atlas from a subject, you first need to compute the tractogram (.tck/.trk/.bundles), register it to MNI space (recommended: image-based registration with ANTs), and resample it to 15 points per fiber. Then use the ProjectAtlasGeoLab command:
$ ProjectAtlasGeoLab -i input${format} -o outputDir -nbPoints 15 -nbThreads ${nbThreads}
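Resampling to a fixed number of points per fiber is the key preprocessing step above. As a rough, dependency-free sketch (not GeoLab's internal implementation; the function name is illustrative), a single streamline can be resampled to equidistant points by linear interpolation along its arc length:

```python
import math

def resample_streamline(points, n_points=15):
    """Resample a streamline (list of 3D points) to n_points
    equidistant points along its arc length (linear interpolation).
    Illustrative sketch only, not GeoLab's actual implementation."""
    # cumulative arc length at each original point
    dists = [0.0]
    for a, b in zip(points, points[1:]):
        dists.append(dists[-1] + math.dist(a, b))
    total = dists[-1]
    out = []
    for i in range(n_points):
        target = total * i / (n_points - 1)
        # find the segment containing the target arc length
        j = 1
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        seg = dists[j] - dists[j - 1]
        t = 0.0 if seg == 0 else (target - dists[j - 1]) / seg
        p0, p1 = points[j - 1], points[j]
        out.append(tuple(p0[k] + t * (p1[k] - p0[k]) for k in range(3)))
    return out
```

In practice, tools such as MRtrix (tckresample) or DIPY perform this step on whole tractograms.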
First, you'll need to resample your atlas to a fixed number of points per fiber. If your atlas is in .tck format, you can use MRtrix's tckresample command.
You'll also need to analyse your atlas to get the bundle-specific thresholds. This step only needs to be done once; the thresholds are then saved in the .minf files:
$ analyseAtlasBundle.py -i atlasDir -f ${format} -r referenceImage.nii
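Bundle-specific thresholds in RecoBundles-style frameworks are typically fiber-to-fiber distance thresholds. As an illustration of the underlying distance (a hypothetical sketch, not necessarily the exact quantity analyseAtlasBundle.py computes), the mean direct-flip (MDF) distance between two fibers resampled to the same number of points is:

```python
import math

def mdf_distance(fiber_a, fiber_b):
    """Mean direct-flip (MDF) distance between two fibers with the
    same number of points: the smaller of the mean point-wise distance
    computed directly and with one fiber reversed (fiber orientation
    in a tractogram is arbitrary). Illustrative sketch only."""
    n = len(fiber_a)
    direct = sum(math.dist(p, q) for p, q in zip(fiber_a, fiber_b)) / n
    flipped = sum(math.dist(p, q) for p, q in zip(fiber_a, reversed(fiber_b))) / n
    return min(direct, flipped)
```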
You'll also need to precompute the full atlas (all bundles in a single file), the atlas neighborhood, and the atlas centroids:
// Compute full atlas
$ fuseAtlas -i atlasDir -o outDirFullAtlas -f ${format}
// Compute atlas neighborhood
$ computeNeighborhood -i outDirFullAtlas/fullAtlas${format} -o outDirNeighborhoodAtlas -r referenceImage.nii
// Compute atlas centroids
$ computeCentroids -i outDirNeighborhoodAtlas -o outDirCentroidsAtlas -r referenceImage.nii -nbPoints ${nbPoints} -nbThreads ${nbThreads} -f ${format}
Then use the ProjectAtlasGeoLab command :
$ ProjectAtlasGeoLab -i input${format} -a atlasDir -ref referenceImage.nii -o outputDir -nbPoints 15 -an outDirNeighborhoodAtlas -anc outDirCentroidsAtlas -nbThreads ${nbThreads}
Your labelled data should be in the form of two files:
.txt -> labels for each fiber, in the form:
fiber_index_k : label_i
...
fiber_index_l : label_j
with:
label_i, ..., label_j integers.
fiber_index_l the index of the fiber in the tractogram used as input for segmentation.
If a fiber has multiple labels, simply repeat its line once per label.
.dict -> dictionary for the labels, in the form:
label_name_i : label_i
...
label_name_j : label_j
with:
label_i, ..., label_j the same integers as in the .txt file.
label_name_i, ..., label_name_j the names of the labels.
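The two file formats above can be parsed with a few lines of Python (a hypothetical sketch; the function names are not part of GeoLab):

```python
def load_labels(txt_path):
    """Parse 'fiber_index : label' lines; a fiber may appear on
    several lines when it carries multiple labels."""
    labels = {}  # fiber index -> set of integer labels
    with open(txt_path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            index, label = line.split(":")
            labels.setdefault(int(index), set()).add(int(label))
    return labels

def load_label_dict(dict_path):
    """Parse 'label_name : label' lines into {name: integer label}."""
    names = {}
    with open(dict_path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            name, label = line.split(":")
            names[name.strip()] = int(label)
    return names
```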
Once you have those files, you can use the following command:
$ scoresPredictedSGT.py -pl labels.txt -pd labels.dict -tl trueLabels.txt -td trueLabels.dict -o outDir
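As an illustration of the kind of per-bundle overlap such a comparison measures (the exact metrics reported by scoresPredictedSGT.py may differ), a Dice coefficient between the predicted and true fiber index sets of one label is:

```python
def dice_coefficient(predicted, truth):
    """Dice overlap between two sets of fiber indices for one bundle.
    Illustrative sketch; not necessarily the script's exact metric."""
    predicted, truth = set(predicted), set(truth)
    if not predicted and not truth:
        return 1.0  # both empty: perfect agreement by convention
    return 2 * len(predicted & truth) / (len(predicted) + len(truth))
```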
If you want to reproduce the results of the paper, the semi-ground truth is in the SGT folder.
First, you need to extract the features for SupWMA with the extract_bundles_feat.py command:
$ extract_bundles_feat.py -i SGT.bundles -o outSGT.h5 -v 1
Then you can use the applySupWMA.py command:
$ applySupWMA.py -t tractogram.bundles -f tractogram.h5 -ep encoderParameters.pickle -ew encoderWeights.pth -cw classifierWeights.pth -ln labelNames.h5 -ld labelsDictSupWMA.txt -spw SupWMA_path -o outDir
Currently, GeoLab is available as a Docker container upon request (email: nabil.vindas@cea.fr).
The atlases are also available upon request (email: nabil.vindas@cea.fr).