
.. -*- mode: rst -*-

.. image:: https://travis-ci.org/neurospin/pypreprocess.svg?branch=master
   :target: https://travis-ci.org/neurospin/pypreprocess
   :alt: Build Status

.. image:: https://coveralls.io/repos/dohmatob/pypreprocess/badge.svg?branch=master
   :target: https://coveralls.io/r/dohmatob/pypreprocess?branch=master

.. image:: https://circleci.com/gh/neurospin/pypreprocess/tree/master.svg?style=shield&circle-token=:circle-token
   :target: https://circleci.com/gh/neurospin/pypreprocess/tree/master

pypreprocess
============

pypreprocess is a collection of Python scripts for preprocessing fMRI data (motion correction, spatial normalization, smoothing, ...).

pypreprocess relies on nipype's interfaces to SPM (both the precompiled/standalone and the MATLAB-dependent SPM flavors). It also ships pure-Python modules and scripts (no C extensions, no compiled code, just Python) for slice-timing correction, motion correction, coregistration, and smoothing, with no need for nipype or MATLAB. It is developed on Linux and tested on Ubuntu 16 and 18; other operating systems are not guaranteed to work.

License
-------

All material is Free Software: BSD license (3 clause).

Important links
---------------

- Official source code repository: https://github.com/neurospin/pypreprocess

Dependencies
------------

pypreprocess depends on the Python packages listed in the Installation section below (scipy, scikit-learn, nibabel, nilearn, configobj, coverage, pytest, matplotlib, pandas, nipype), plus SPM (precompiled/standalone or MATLAB-based) for the nipype-based pipelines.

Installation
------------

To begin with, you may want to install the precompiled (standalone) version of SPM, in case you don't have MATLAB. From the main pypreprocess directory, run the following::

 $ . continuous_integration/install_spm12.sh

Second, install the Python packages scipy, pytest, nibabel, scikit-learn, nipype, pandas, matplotlib, nilearn, and configobj (pip is needed for this). If you are in a Python virtual environment, just run::

 $ pip install scipy scikit-learn nibabel nilearn configobj coverage pytest matplotlib pandas nipype

If not, make sure to install pip first (e.g. run 'sudo apt-get install python-pip'). If you want to install the packages locally for your user only, use the --user option::

 $ pip install scipy scikit-learn nibabel nilearn configobj coverage pytest matplotlib pandas nipype --ignore-installed --user

If you want to install these for all users, use sudo::

 $ sudo pip install scipy scikit-learn nibabel nilearn configobj coverage pytest matplotlib pandas nipype --ignore-installed

Finally, install pypreprocess itself by running the following from the pypreprocess directory::

 $ python setup.py install --user

or simply 'python setup.py install' in a virtual environment.
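
As a quick sanity check, you can verify that the package is importable::

 $ python -c "import pypreprocess"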

After installation: a few steps to configure SPM on your machine
------------------------------------------------------------------

There are three cases, depending on how SPM is installed on your machine.

Getting started: pypreprocess 101
-----------------------------------

Simply cd to the examples/easy_start/ sub-directory and run the following command::

   $ python nipype_preproc_spm_auditory.py

If you get nipype errors like "could not configure SPM", the most likely cause is that the SPM_DIR and SPM_MCR environment variables (see above) have not been exported in this shell.
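
If you prefer not to export them in the shell, the same variables can also be set from inside Python before pypreprocess is imported; a minimal sketch, where the paths are placeholders to adapt to your own SPM installation::

   import os

   # Tell pypreprocess/nipype where SPM lives. These paths are placeholders;
   # point them to your standalone SPM12 installation (e.g. the one set up by
   # continuous_integration/install_spm12.sh).
   os.environ["SPM_DIR"] = "/path/to/spm12"
   os.environ["SPM_MCR"] = "/path/to/spm12/run_spm12.sh"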

Layout of examples
------------------

We have written example scripts for preprocessing several popular datasets. The examples directory contains a set of scripts, each demoing one aspect of pypreprocess. Some scripts also provide use-cases for the nipy-based GLM. The examples use publicly available sMRI and fMRI data, with data fetchers based on the nilearn API. The main example scripts can be summarized as follows:

Very easy examples
~~~~~~~~~~~~~~~~~~

More advanced examples
~~~~~~~~~~~~~~~~~~~~~~

Examples using pure Python (no SPM, FSL, etc. required)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Using .ini configuration files to specify the pipeline
--------------------------------------------------------

It is possible (and recommended) to configure the preprocessing pipeline simply by copying any of the .ini configuration files under the examples sub-directory and modifying it (usually, you only need to change the dataset_dir parameter), and then running::

  $ python pypreprocess.py your.ini

For example::

  $ python pypreprocess.py examples/easy_start/spm_auditory_preproc.ini
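
If you prefer to tweak a copied configuration programmatically, the configobj package installed above can be used; a minimal sketch, where the file name and paths are placeholders and dataset_dir is assumed to sit at the top level of the .ini file::

   from configobj import ConfigObj

   # Load a copy of one of the example .ini files (placeholder name) and
   # point dataset_dir at your own data before running pypreprocess.py on it.
   config = ConfigObj("my_preproc.ini")
   config["dataset_dir"] = "/path/to/my/dataset"
   config.write()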

Pipelines
---------

We provide two main preprocessing pipelines: the standard pipeline and the DARTEL-based pipeline. At the end of either, each subject's EPI data have been corrected for artefacts and placed into a common reference space (MNI). When you invoke the do_subjects_preproc(..) API of nipype_preproc_spm_utils.py to preprocess a dataset (a group of subjects), the standard pipeline is used by default; passing the option do_dartel=True forces the DARTEL-based pipeline instead. You can also fine-tune your pipeline using the various supported parameters in your .ini file (see the examples/ sub-directory for examples).
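
For instance, a minimal sketch of driving this API from Python; the import path and the fact that an .ini file can be passed directly are assumptions here, and the example scripts under examples/ remain the canonical reference::

   from pypreprocess.nipype_preproc_spm_utils import do_subjects_preproc

   # Standard pipeline (the default).
   results = do_subjects_preproc("examples/easy_start/spm_auditory_preproc.ini")

   # DARTEL-based pipeline: same call, with do_dartel=True.
   results = do_subjects_preproc("examples/easy_start/spm_auditory_preproc.ini",
                                 do_dartel=True)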

Standard pipeline
~~~~~~~~~~~~~~~~~

For each subject, the EPI data go through the standard intra-subject steps: motion correction, coregistration, spatial normalization into MNI space, and smoothing.

DARTEL pipeline
~~~~~~~~~~~~~~~

Motion correction and coregistration proceed exactly as in the standard pipeline; the only difference is the way the subjects' EPI data are warped into MNI space. In the DARTEL pipeline, SPM's DARTEL is used to warp the subject brains into MNI space.

Intra-subject preprocessing in pure Python (with no compiled code, etc.)
---------------------------------------------------------------------------

A couple of modules for intra-subject preprocessing (slice-timing correction, motion correction, coregistration, etc.) have been implemented in pure Python (using only the standard library and numpy/scipy; no compiled code, no wrappers). To demo this feature, simply run the following command::

   $ python examples/pure_python/pure_python_preproc_demo.py

Development
-----------

You can check out the latest version of the code with the command::

   $ git clone git://github.com/neurospin/pypreprocess.git

or if you have write privileges::

   $ git clone git@github.com:neurospin/pypreprocess.git