MRtrix3 / mrtrix3

MRtrix3 provides a set of tools to perform various advanced diffusion MRI analyses, including constrained spherical deconvolution (CSD), probabilistic tractography, track-density imaging, and apparent fibre density.
http://www.mrtrix.org
Mozilla Public License 2.0

New commands: connectome2metric, connectomeedit #714

Closed: Lestropie closed this 2 months ago

Lestropie commented 8 years ago

A command that would duplicate some BCT functionality, generating scalar / node-wise measures based on connectivity matrices. Not something I'm really a fan of, but it would probably be well-used. Limit to weighted undirected measures (at least for now).

Functionality could be re-used in a command such as connectomestats, which would import the subject matrices and compute the scalar metric of interest for each before performing permutation testing. (The alternative would be a command that imports subject data as a single vector file, one scalar value per subject, and then runs the stats; not sure what to call such a command.)

Lestropie commented 8 years ago

Results of discussion regarding how this functionality should behave / be interfaced with.

Now there will be two commands: connectome2metric and connectomeedit. Hopefully the reasons will come out in the description.

connectome2metric

void usage()
{

  ARGUMENTS
  + Argument("matrix_in", "the connectome matrix file(s)").type_file_in().allow_multiple();

  OPTIONS
  + OptionGroup ("Options for selecting the metric(s) to calculate")
    + Option ("nodewise", "Only calculate the requested node-mise metrics (can specify more than one in a comma-separated list); options are: " + join (nodewise_metric_choices, ",")).type_text()
    + Option ("global",   "Only calculate the requested global metrics (can specify more than one in a comma-separated list); options are: " + join (global_metric_choices, ",")).type_text();

  + OptionGroup ("Other options for connectome2metric")
    + Option ("lut", "Import a lookup table to use when labelling rows of the -nodewise output")
      + Argument ("file").type_file_in();

}

Note that we decided on comma-separated strings for -nodewise and -global, which means manually checking the validity of each string after comma-splitting, but usage should be neater.
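For illustration, the post-split validation amounts to something like the following (Python used only for brevity, since the actual command would be C++; the metric names in the list below are placeholders rather than a decided set):

def parse_metric_list(value, choices):
  # Split the comma-separated option value and check each entry against
  # the list of permitted metric names
  requested = [entry.strip() for entry in value.split(',')]
  for entry in requested:
    if entry not in choices:
      raise ValueError('Unrecognised metric "%s"; options are: %s'
                       % (entry, ','.join(choices)))
  return requested

nodewise_metric_choices = ['degree', 'strength', 'clustering']   # placeholder names only
parse_metric_list('degree,strength', nodewise_metric_choices)    # -> ['degree', 'strength']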

Usage cases:

Node,  Metric #1,  Metric #2, ...
   1,       23.4,       56.7, ...

Node, Subject #1, Subject #2, ...
   1,       23.4,        56.7, ...

connectomeedit

Options for editing a connectome, e.g. prior to calculations in connectome2metric:

matteofrigo commented 5 years ago

I have my own implementation of the connectomeedit command in Python, and I could refactor it to fit the MRtrix3 Python APIs.

If I understand correctly, we would ask for the original and the filtered LUTs as input, then keep only the entries corresponding to the labels that are in both lists. The order of the labels is assumed to be the one given in the original LUT. Is this correct?
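In sketch form, that interpretation would amount to something like the following (the function names are made up for illustration, and the LUT is simplified to a plain two-column index / name text file; real LUT formats are more varied, as noted in the reply below):

def load_lut_names(path):
  # Simplified LUT reader: assumes a plain two-column "index name" format
  names = []
  with open(path) as f:
    for line in f:
      line = line.strip()
      if line and not line.startswith('#'):
        names.append(line.split()[1])
  return names

def filter_connectome(matrix, original_lut_path, filtered_lut_path):
  # Keep only the nodes whose labels appear in both LUTs,
  # preserving the ordering of the original LUT
  original = load_lut_names(original_lut_path)
  keep_names = set(load_lut_names(filtered_lut_path))
  keep = [i for i, name in enumerate(original) if name in keep_names]
  return [[matrix[i][j] for j in keep] for i in keep]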

Lestropie commented 5 years ago

That would be awesome @matteofrigo. Targeting the dev branch would be preferable, not only because it's a new feature but also because the Python API has changed and includes convenience functions for loading & saving matrix data.

I had to read the LUT import / export comment a few times to remind myself what the intent there was. I believe the idea was that if one specified that particular nodes were to be removed, one could provide the LUT on which the input matrix was based, and the script would output a modified LUT where the nodes removed from the matrix were also removed from the LUT. This would however require Python compatibility with the wide gamut of possible LUT formats (see here). So omitting that particular idea would be perfectly fine.

I should also note, since it's slightly related to this functionality, that I did commence a feature branch at some point in the past where user preferences such as the storage format for symmetric matrices (i.e. symmetric / lower triangular / upper triangular) could be config file options rather than / in addition to command-line options. Looks like the connectome_output_config branch is there, but it's fairly bare.
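For reference, the three storage layouts mentioned above differ only in which entries of each row get written out; a pure-Python sketch of that distinction (the function name is illustrative, and the config-file plumbing that would select the layout is omitted):

def matrix_rows_for_output(matrix, layout='symmetric'):
  # 'matrix' is assumed to be a square, symmetric list of lists
  rows = []
  for i, row in enumerate(matrix):
    if layout == 'symmetric':     # full square matrix
      rows.append(row)
    elif layout == 'lower':       # row i keeps columns 0..i
      rows.append(row[:i+1])
    elif layout == 'upper':       # row i keeps columns i..N-1
      rows.append(row[i:])
    else:
      raise ValueError('Unknown matrix layout: ' + layout)
  return rows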

matteofrigo commented 5 years ago

I did a first refactoring of my python script that made it compatible with the APIs.

I did not use the convenience functions for loading and saving matrix data, as they do almost the same job as numpy.loadtxt and numpy.savetxt. Since numpy is in the dependencies of mrtrix3, I don't see why we should avoid using it, but if needed I'll rewrite the functions to be compatible with lists.

Lestropie commented 5 years ago

> Since numpy is in the dependencies of mrtrix3, I don't see why we should avoid using it, but if needed I'll rewrite the functions to be compatible with lists.

Actually, technically it's not. On master, population_template will use numpy if it's available, or its own functions if not. On dev, that dependency is now removed entirely. Also, the save functions in the mrtrix3.matrix module will automatically insert provenance information. I should remove numpy from the installation instructions in the documentation on dev.
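For context, usage of those convenience functions from within an MRtrix3 Python script looks roughly like this (load_matrix is as named later in this thread; the save function name is from memory, the file names are placeholders, and data are plain nested lists rather than numpy arrays):

from mrtrix3 import matrix

data = matrix.load_matrix('connectome.csv')          # returns a list of lists of floats
# ... edit 'data' in place ...
matrix.save_matrix('connectome_edited.csv', data)    # inserts provenance comment line(s)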

matteofrigo commented 5 years ago

I see. I adapted the functions.

matteofrigo commented 5 years ago

The command history of the processed connectomes is not appended, as the load_matrix function does not consider the metadata contained in the file. This means that for a text matrix, only the last operation will be saved in the command history. Do you think this is material for a bug report?

Lestropie commented 5 years ago

Wouldn't make it a bug report, but it could be a requested feature.

The reason we can easily keep full track of provenance in images is because the image data and header key-value pairs are intrinsically tied together in code (Header class), and therefore on construction of a new image the full command history is right there. For text file data, both in C++ and Python, I've been content from #1603 with just writing a single entry in command_history as being a significant step over what we had previously (which was nothing). To have full command provenance in output text files would require explicit handling, both in C++ and Python, to capture and encapsulate such data (KeyValues class in C++, dict in Python) alongside any data that may eventually be written to text file. So it would take some explicit handling, and probably a decent bit of care to make sure such data are utilised in all possible scenarios.