daducci / COMMIT

Linear framework to combine tractography and tissue micro-structure estimation with diffusion MRI

COMMIT2 on QSIPrep/QSIRecon Outputs #110

Closed: smeisler closed this issue 1 year ago

smeisler commented 2 years ago

Hello,

I am trying to run COMMIT2 on preprocessed DWI and tractograms from QSIPrep, using the AAL atlas registered to native space. I get the following error when adapting the tutorial.

Code input

# import packages
import os
import numpy as np
import commit
from commit import trk2dictionary
import scipy.io
import amico
#commit.setup()

# Get names of relevant directories
bids = '/om4/group/gablab/data/HBN/'
derivatives = os.path.join(bids,'derivatives')
qsiprep_dir = os.path.join(derivatives,'qsiprep')
qsirecon_dir = os.path.join(derivatives,'qsirecon')
commit_outdir = os.path.join(derivatives,'commit_testing')
if not os.path.exists(commit_outdir):
    os.makedirs(commit_outdir)

# Get qsiprep/qsirecon output
sub = 'sub-NDARAA306NT2'
dwi_path = os.path.join(qsiprep_dir,sub,'dwi',sub+'_acq-64dir_space-T1w_desc-preproc_dwi.nii.gz')
bval_path = os.path.join(qsiprep_dir,sub,'dwi',sub+'_acq-64dir_space-T1w_desc-preproc_dwi.bval')
bvec_path = os.path.join(qsiprep_dir,sub,'dwi',sub+'_acq-64dir_space-T1w_desc-preproc_dwi.bvec')
wm_mask_path =  os.path.join(qsiprep_dir,sub,'anat',sub+'_label-WM_mask.nii.gz')
tractogram_path = os.path.join(qsirecon_dir,sub,'dwi',
                               sub+'_acq-64dir_space-T1w_desc-preproc_space-T1w_desc-tracks_ifod2.tck')
atlas_path = os.path.join(qsirecon_dir,sub,'dwi',
                               sub+'_acq-64dir_space-T1w_desc-preproc_space-T1w_desc-aal116_atlas.mif.gz') # AAL atlas

# Get only connecting fibers
os.makedirs('bundles', exist_ok=True)  # connectome2tck below writes into bundles/ and does not create the directory itself
os.system(('/om2/user/smeisler/anaconda3/envs/commit/bin/tck2connectome -force -nthreads 0 -assignment_radial_search 2 -out_assignments fibers_assignment.txt ' + tractogram_path + ' ' + atlas_path + ' connectome.csv'))
os.system(( '/om2/user/smeisler/anaconda3/envs/commit/bin/connectome2tck -force -nthreads 0 -exclusive -files per_edge -keep_self ' + tractogram_path + ' fibers_assignment.txt bundles/bundle_' ))
os.system('/om2/user/smeisler/anaconda3/envs/commit/bin/tckedit -force -nthreads 0 bundles/bundle*.tck fibers_connecting.tck' )

# Define dictionary
trk2dictionary.run(
    filename_tractogram = 'fibers_connecting.tck',
    filename_mask       = wm_mask_path)

amico.util.fsl2scheme( bval_path, bvec_path, 'DWI.scheme' )

# load the data
mit = commit.Evaluation( '.', '.' )
mit.load_data( dwi_path, 'DWI.scheme' )

# use a forward-model with 1 Stick for the streamlines and 2 Balls for all the rest
mit.set_model( 'StickZeppelinBall' )
d_par       = 1.7E-3             # Parallel diffusivity [mm^2/s]
d_perps_zep = []                 # Perpendicular diffusivity(s) [mm^2/s]
d_isos      = [ 1.7E-3, 3.0E-3 ] # Isotropic diffusivity(s) [mm^2/s]
mit.model.set( d_par, d_perps_zep, d_isos )

mit.generate_kernels( regenerate=True )
mit.load_kernels()

# create the sparse data structures to handle the matrix A
mit.load_dictionary( 'COMMIT' )
##### CODE DOES NOT GET PAST HERE #####

mit.set_threads()
mit.build_operator()

# perform the fit
mit.fit( tol_fun=1e-3, max_iter=1000, verbose=False )
mit.save_results( path_suffix="_COMMIT1" )

# Retrieve the streamline contributions estimated by COMMIT for later use in COMMIT2:
x_nnls, _, _ = mit.get_coeffs( get_normalized=False )

Terminal Output (beginning from amico.util.fsl2scheme)

amico.util.fsl2scheme( bval_path, bvec_path, 'DWI.scheme' )
-> Writing scheme file to [ DWI.scheme ]
'DWI.scheme'
>>> 
>>> # load the data
>>> mit = commit.Evaluation( '.', '.' )
>>> mit.load_data( dwi_path, 'DWI.scheme' )

-> Loading data:
    * Acquisition scheme:
        - diffusion-weighted signal
        - 129 samples, 2 shells
        - 1 @ b=0, 64 @ b=1000.0, 64 @ b=2000.0
    * Signal dataset:
        - dim    : 162 x 193 x 164 x 129
        - pixdim : 1.200 x 1.200 x 1.200
        - values : min=0.00, max=1144.45, mean=11.99
   [ 20.4 seconds ]

-> Preprocessing:
    * Normalizing to b0... [ min=0.00, max=308603.06, mean=0.49 ]
   [ 1.4 seconds ]
>>> 
>>> # use a forward-model with 1 Stick for the streamlines and 2 Balls for all the rest
>>> mit.set_model( 'StickZeppelinBall' )
>>> d_par       = 1.7E-3             # Parallel diffusivity [mm^2/s]
>>> d_perps_zep = []                 # Perpendicular diffusivity(s) [mm^2/s]
>>> d_isos      = [ 1.7E-3, 3.0E-3 ] # Isotropic diffusivity(s) [mm^2/s]
>>> mit.model.set( d_par, d_perps_zep, d_isos )
>>> 
>>> mit.generate_kernels( regenerate=True )

-> Simulating with "Stick-Zeppelin-Ball" model:
   [ 100.0% ]
   [ 2.0 seconds ]
>>> mit.load_kernels()

-> Resampling LUT for subject ".":
    * Keeping all b0 volume(s)...
   [ 100.0% ]
      [ OK ]
    * Normalizing... [ OK ]
   [ 3.7 seconds ]
>>> 
>>> # create the sparse data structures to handle the matrix A
>>> mit.load_dictionary( 'COMMIT' )

-> Loading the dictionary:
[ WARNING ] Dictionary does not have the same geometry as the dataset
    * Segments from the tracts... [ 8328748 fibers and 1495561589 segments ]
    * Segments from the peaks...  [ 0 segments ]
    * Isotropic contributions...  [ 446120 voxels ]
    * Post-processing...          Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "commit/core.pyx", line 359, in commit.core.Evaluation.load_dictionary
  File "commit/core.pyx", line 497, in commit.core.Evaluation.load_dictionary
IndexError: index 5138709 is out of bounds for axis 0 with size 5127624

Please let me know what other information I can provide to help.

Thanks, Steven

smeisler commented 2 years ago

Is it a problem that the WM mask has a higher spatial resolution than the diffusion data?

daducci commented 2 years ago

Hi @smeisler ,

Indeed, I cannot see the output of the trk2dictionary step, but yes, I suspect the problem is that the images have different sizes. Could you try with a mask that has the same geometry as your DWI data?
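
For example, if you have MRtrix3 available, something along these lines should regrid the mask onto the voxel grid of the DWI (just a sketch; the file names are placeholders to adapt to your data):

# nearest-neighbour interpolation keeps the mask binary
mrgrid sub-XXX_label-WM_mask.nii.gz regrid -template sub-XXX_space-T1w_desc-preproc_dwi.nii.gz -interp nearest sub-XXX_label-WM_mask_dwigrid.nii.gz

Passing the regridded mask to trk2dictionary should then make the dictionary geometry match the dataset.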

smeisler commented 2 years ago

👍 Will do later this week and report back

daducci commented 2 years ago

Hi @smeisler ,

any update on this issue? Can we close it?

Cheers, Ale

cicadawing commented 1 year ago

Hi there!

It's been a while since this was last discussed. I am also trying to use QSIPrep and QSIRecon output for COMMIT/COMMIT2 - and am just wondering whether any progress was made on this issue.

Cheers T

smeisler commented 1 year ago

Hi @cicadawing: Sorry, I forgot to follow up on this. It has been so long that I don't know if this issue is even still relevant. Are you running into the same issue when you try?

daducci commented 1 year ago

Hi all,

Not sure if the issue is still relevant (perhaps it has been solved in the meantime), but it would be interesting to know the cause and, perhaps, post the solution for others to find. We're glad to help if needed!

cicadawing commented 1 year ago

Hi all,

Apologies for the delay. I had been tinkering to try to get something working and was consistently bumping into issues, but I believe I have now established a bit of a workflow that takes things a bit further than before :)

I would really appreciate your thoughts on the current workflow:

**# Firstly, register the white matter probseg mask to the processed DWI in QSIPrep**

flirt -in ${qsiprep_dir}/sub-${s}/anat/sub-${s}_label-WM_probseg.nii.gz -ref ${qsiprep_dir}/sub-${s}/dwi/sub-${s}_dir-RL_space-T1w_desc-preproc_dwi.nii.gz -out ${commit_out}/flirt_sub-${s}_label-WM_probseg.nii.gz

**# Binarise flirted WM mask**

fslmaths ${commit_out}/flirt_sub-${s}_label-WM_probseg.nii.gz -thr 0.5 -bin ${commit_out}/flirt_binarised_sub-${s}_label-WM_probseg.nii.gz

Use this binarised/flirted WM mask as the WM mask in COMMIT. Then I run the following (with the Schaefer400 atlas):

import os
import numpy as np
import commit
from commit import trk2dictionary
import scipy.io
import amico
import argparse

parser = argparse.ArgumentParser(
    description='Run COMMIT2')
parser.add_argument('--subj_id', default=None, type=str, help="Subj number")
parser.add_argument('--model_choice', default=None, type=str, help="Choose your model: StickZeppelinBall, CylinderZeppelinBall")
parser.add_argument('--qsiprep_output_path', default=None, type=str, help="Path to qsiprep output directory")
parser.add_argument('--qsirecon_output_path', default=None, type=str, help="Path to qsirecon output directory")
parser.add_argument('--atlas_type', default=None, type=str, help="atlas_type")

# Get names of relevant directories
args = parser.parse_args()
qsiprep_dir = args.qsiprep_output_path
qsirecon_dir = args.qsirecon_output_path
atlas_type=args.atlas_type
sub=args.subj_id
model_choice=args.model_choice

commit_path = os.path.join(qsirecon_dir,'COMMIT',f'sub-{sub}')
dwi_path = os.path.join(qsiprep_dir,f'sub-{sub}','dwi',f'sub-{sub}_dir-RL_space-T1w_desc-preproc_dwi.nii.gz')
bval_path = os.path.join(qsiprep_dir,f'sub-{sub}','dwi',f'sub-{sub}_dir-RL_space-T1w_desc-preproc_dwi.bval')
bvec_path = os.path.join(qsiprep_dir,f'sub-{sub}','dwi',f'sub-{sub}_dir-RL_space-T1w_desc-preproc_dwi.bvec')
wm_mask_path =  os.path.join(commit_path,f'flirt_binarised_sub-{sub}_label-WM_probseg.nii.gz')
tractogram_path = os.path.join(qsirecon_dir,f'sub-{sub}','dwi',
                               f'sub-{sub}_dir-RL_space-T1w_desc-preproc_desc-tracks_ifod2.tck')
atlas_path = os.path.join(qsirecon_dir,f'sub-{sub}','dwi',
                               f'sub-{sub}_dir-RL_space-T1w_desc-preproc_desc-{atlas_type}_atlas.mif.gz') # Schae400 atlas

os.chdir(commit_path)

# Get only connecting fibers
if not os.path.isfile( 'connectome.csv' ) :
    os.system(('tck2connectome -force -nthreads 0 -assignment_radial_search 2 -out_assignments fibers_assignment.txt ' + tractogram_path + ' ' + atlas_path + ' connectome.csv'))

if not os.path.isdir( 'bundles' ) :
    os.mkdir('bundles')
    os.system(( f'connectome2tck -force -nthreads 0 -exclusive -files per_edge -keep_self {tractogram_path} fibers_assignment.txt bundles/bundle_' ))

if not os.path.isfile( 'fibers_connecting.tck' ) :
    C = np.loadtxt( 'connectome.csv', delimiter=' ' ) # NB: change 'delimiter' to suit your needs
    CMD = 'tckedit -force -nthreads 0'
    for i in range(C.shape[0]):
        CMD_i = 'tckedit -force -nthreads 0'
        for j in range(i,C.shape[0]):
            if C[i,j] > 0 :
                CMD_i += ' bundles/bundle_%d-%d.tck' %(i+1,j+1)
        os.system( CMD_i + ' bundles/fibers_connecting_%d.tck' % (i+1) )
        CMD += ' bundles/fibers_connecting_%d.tck' %(i+1)
    os.system( CMD + ' fibers_connecting.tck' )

# Define dictionary
trk2dictionary.run(
    filename_tractogram = 'fibers_connecting.tck',
    filename_mask       = wm_mask_path)

amico.util.fsl2scheme( bval_path, bvec_path, 'DWI.scheme' )

# load the data
mit = commit.Evaluation( '.', '.' )
mit.load_data( dwi_path, 'DWI.scheme' )

# use a forward-model with 1 Stick for the streamlines and 2 Balls for all the rest
mit.set_model( model_choice )
d_par       = 1.7E-3             # Parallel diffusivity [mm^2/s]
d_perps_zep = []                 # Perpendicular diffusivity(s) [mm^2/s]
d_isos      = [ 1.7E-3, 3.0E-3 ] # Isotropic diffusivity(s) [mm^2/s]
mit.model.set( d_par, d_perps_zep, d_isos )

mit.generate_kernels( regenerate=True )
mit.load_kernels()

# create the sparse data structures to handle the matrix A
mit.load_dictionary( 'COMMIT' )

mit.set_threads()
mit.build_operator()

# perform the fit
mit.fit( tol_fun=1e-3, max_iter=1000, verbose=True )
mit.save_results( path_suffix="_COMMIT1" )

# Retrieve the streamline contributions estimated by COMMIT for later use in COMMIT2:
x_nnls, _, _ = mit.get_coeffs( get_normalized=False )

#START COMMIT2

C = np.loadtxt( 'connectome.csv', delimiter=' ' )
print(C)
C = np.triu( C ) # be sure to get only the upper-triangular part of the matrix
group_size = C[C>0].astype(np.int32)
length_g_size= len(group_size)

tmp = np.insert( np.cumsum(group_size), 0, 0 )
group_idx = np.array( [np.arange(tmp[i],tmp[i+1]) for i in range(len(tmp)-1)], dtype=np.object_ )

group_w = np.empty_like( group_size, dtype=np.float64 )
for k in range(group_size.size) :
    group_w[k] = np.sqrt(group_size[k]) / ( np.linalg.norm(x_nnls[group_idx[k]]) + 1e-12 )

reg_lambda = 5e-4 # Determine this empirically

prior_on_bundles = commit.solvers.init_regularisation(
    mit,
    regnorms    = [commit.solvers.group_sparsity, commit.solvers.non_negative, commit.solvers.non_negative],
    structureIC = group_idx,
    weightsIC   = group_w,
    lambdas     = [reg_lambda, 0.0, 0.0]
)

mit.fit( tol_fun=1e-3, max_iter=1000, regularisation=prior_on_bundles, verbose=False )
mit.save_results( path_suffix=f"_COMMIT2_sub-{sub}" )

I get a pretty good model fit:

-> Saving results to "Results_StickZeppelinBall_COMMIT1/*":
* Fitting errors:
- RMSE...  [ 0.087 +/- 0.104 ]
- NRMSE... [ 0.205 +/- 0.103 ]
* Voxelwise contributions:
- Intra-axonal... [ OK ]
- Extra-axonal... [ OK ]
- Isotropic...    [ OK ]
* Configuration and results:
- streamline_weights.txt... [ OK ]
- results.pickle...         [ OK ]

And, inspecting the output NIfTI files, things seem to be working well for COMMIT1.

I do get an issue when running COMMIT2 now, though:

    * Exporting EC compartments:
     [ 0 voxels, 0 segments ]
Traceback (most recent call last):
  File "path/to/.py", line 132, in <module>
    mit.fit( tol_fun=1e-3, max_iter=1000, regularisation=prior_on_bundles, verbose=False )
  File "path/to/.py", line 144, in init_regularisation
    structureIC = np.array(newStructureIC)
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (76282,) + inhomogeneous part.

For some subjects, I also occasionally get the warning below, though there is no indication from the QSIPrep/QSIRecon output that anything is going wrong in the processing workflow at that stage:

tck2connectome: [WARNING] The following nodes do not have any streamlines assigned:
tck2connectome: [WARNING] 325, 327
tck2connectome: [WARNING] (This may indicate a poor registration)

Thank you so much in advance, and let me know if I can add any more info here.

Cheers

T

cicadawing commented 1 year ago

Hi all, I was just wondering whether anybody had had the chance to look at this issue? Cheers T

nightwnvol commented 1 year ago

Hi @cicadawing,

thanks for reporting this issue!

We fixed it, and if you would like to try it, your help would be greatly appreciated. To install the updated version, simply run `pip install git+https://github.com/daducci/COMMIT.git@release/v2.0.1`.
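
After installing, a quick way to double-check which version actually ended up in your environment is:

pip show dmri-commit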

cicadawing commented 1 year ago

Thank you so much! Keen to investigate this.

Unfortunately, when I attempt to run the above, I now get an error with installation.

  Building wheel for dmri-commit (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for dmri-commit (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [31 lines of output]
      /tmp/pip-build-env-cnf8gjy2/overlay/lib/python3.9/site-packages/setuptools/dist.py:498: SetuptoolsDeprecationWarning: Invalid dash-separated options
      !!

              ********************************************************************************
              Usage of dash-separated 'description-file' will not be supported in future
              versions. Please use the underscore name 'description_file' instead.

              This deprecation is overdue, please update your project and remove deprecated
              calls to avoid build errors in the future.

              See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
              ********************************************************************************

      !!
        opt = self.warn_dash_deprecation(opt, section)
      /tmp/pip-build-env-cnf8gjy2/overlay/lib/python3.9/site-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
      !!

              ********************************************************************************
              The license_file parameter is deprecated, use license_files instead.

              By 2023-Oct-30, you need to update your project and remove deprecated calls
              or your builds will no longer be supported.

              See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
              ********************************************************************************

      !!
        parsed = self.parsers.get(option_name, lambda x: x)(value)
      running bdist_wheel
      error: error in 'egg_base' option: './build' does not exist or is not a directory
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for dmri-commit
Failed to build dmri-commit
ERROR: Could not build wheels for dmri-commit, which is required to install pyproject.toml-based projects

I have had no previous issues with COMMIT installation and setup on the HPC system I work with (using Python 3.9).

The issue persists with this command as well:

pip install dmri-commit

nightwnvol commented 1 year ago

Hi @cicadawing,

sorry for the inconvenience. There was a problem with some build options, but it should be fixed now. Please try reinstalling the package with `pip install git+https://github.com/daducci/COMMIT.git@release/v2.0.1`.

cicadawing commented 1 year ago

Hi @nightwnvol - I am pleased to report that COMMIT2 is now running - and that I have output :)

Unfortunately, I occasionally get an issue with missing streamlines during tck2connectome - but I believe that this is an issue on my end.

Thanks again - you have been a huge help :)

fullbat commented 1 year ago

> Hi @nightwnvol - I am pleased to report that COMMIT2 is now running - and that I have output :)
>
> Unfortunately, I occasionally get an issue with missing streamlines during tck2connectome - but I believe that this is an issue on my end.
>
> Thanks again - you have been a huge help :)

Hi @cicadawing, glad to hear we helped you :) If I may suggest, a quick thing you could try to recover some missing streamlines is to increase the assignment_radial_search parameter (e.g. to 3 or 4, I wouldn't go higher) to see whether some of them stopped just before reaching the ROI. If the problem persists, I recommend checking whether the atlas you are using is properly registered; if the registration looks fine, consider trying different tracking parameters or algorithms.
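
For example, reusing the tck2connectome call from earlier in this thread with a wider search radius (paths are placeholders to adapt to your own files):

tck2connectome -force -nthreads 0 -assignment_radial_search 3 -out_assignments fibers_assignment.txt sub-XXX_desc-tracks_ifod2.tck sub-XXX_atlas.mif.gz connectome.csv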