neuropythy

A neuroscience library for Python, intended to complement the existing nibabel library.

For additional documentation, in particular usage documentation, see the neuropythy wiki and the OSF wiki for Benson and Winawer (2018).

Author

Noah C. Benson <nben@uw.edu>

Installation

The neuropythy library is available on PyPI and can be installed via pip:

pip install neuropythy

The dependencies (below) should be installed automatically. Alternatively, you can check out this GitHub repository and run setuptools:

# Clone the repository
git clone https://github.com/noahbenson/neuropythy
# Enter the repo directory
cd neuropythy
# setup the submodules
git submodule init && git submodule update
# Install the library
python setup.py install
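
If you plan to modify the neuropythy source, a pip "editable" install (a standard pip feature, not specific to neuropythy) may be more convenient than setup.py install, since local changes then take effect without reinstalling:

# From the repository root, after initializing the submodules:
pip install -e .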

Dependencies

The neuropythy library depends on a few other libraries, all freely available.

These libraries should be installed automatically for you if you use pip or setuptools (see above), and they must be found on your PYTHONPATH in order to use neuropythy.

Optional Dependencies

All optional dependencies are included in the requirements-dev.txt file in the neuropythy repository root.
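
For example, after cloning the repository (see Installation, above), the optional dependencies can be installed with pip; note that the exact packages listed in requirements-dev.txt may change between releases:

# From the neuropythy repository root:
pip install -r requirements-dev.txt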

Python Version

Neuropythy is compatible with both Python 2 and 3. It was developed under 2.7 and is now used primarily with 3.6.

Configuration

Neuropythy is most useful when it knows where to find your FreeSurfer subject data or where you want it to store datasets or Human Connectome Project files. These configuration items can be set in a number of ways.
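
One simple approach is to set the relevant environment variables before starting Python; the sketch below uses the SUBJECTS_DIR and NPYTHY_DATA_CACHE_ROOT variables referenced in the Docker section of this README, and the paths shown are only placeholders:

# Tell neuropythy where your FreeSurfer subjects live:
export SUBJECTS_DIR="/data/freesurfer_subjects"
# Tell neuropythy where to cache downloaded datasets and HCP files:
export NPYTHY_DATA_CACHE_ROOT="$HOME/npythy_cache"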

Human Connectome Project Integration

The neuropythy library is capable of automatically integrating with the Human Connectome Project's Amazon S3 bucket. Neuropythy will present you with nested data structures representing individual HCP subjects and will silently download the relevant structure files as they are requested. To configure this behavior, follow these steps:

Note that the above steps will additionally enable auto-downloading of the retinotopic mapping database; if you are only interested in the structural data, you can set the "hcp_auto_download" variable to "structure". If you do enable auto-downloading of the retinotopic maps, then the first time you examine an HCP subject, neuropythy will have to download the retinotopy database files, which are approximately 1 GB; it may appear that neuropythy has frozen during this time, but it is most likely just waiting on the download. Otherwise, if your internet connection is reasonably fast, you should not notice significant delays from downloading the HCP structural data.
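
One way to sketch this configuration is via environment variables; HCP_AUTO_DOWNLOAD and HCP_CREDENTIALS are the variable names referenced in the Docker section below, and the credential value shown is only a placeholder (the exact format is described on the HCP page linked below):

# Auto-download only the HCP structural data, not the retinotopy database:
export HCP_AUTO_DOWNLOAD="structure"
# Your HCP Amazon S3 credentials:
export HCP_CREDENTIALS="<your-hcp-s3-credentials>"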

For more information about using the HCP module of neuropythy, see this page.

Additional notes:

Builtin Datasets

Neuropythy now comes with support for builtin datasets. These datasets are downloaded when they are first requested and are only re-downloaded if necessary; note that if you have not configured neuropythy's "data_cache_root" configuration variable (see Configuration, above), then the data will be downloaded to a temporary directory that is deleted when Python exits.

Currently, there is only one builtin dataset (not including the Human Connectome Project dataset, above), and that is the dataset from Benson and Winawer (2018). To access this dataset:

import neuropythy as ny
subs = ny.data['benson_winawer_2018'].subjects
sorted(subs.keys())
#=> ['S1201', 'S1202', 'S1203', 'S1204', 'S1205', 'S1206', 'S1207', 'S1208', 'fsaverage']
subs['S1201']
#=> Subject(<S1201>,
#=>         <'/Users/nben/Temp/npythy_cache/benson_winawer_2018/freesurfer_subjects/S1201'>)
subs['S1201'].lh.prop('prf_polar_angle')
#=> array([118.811386, 118.80122 , 120.842255, ..., -14.08387 , -62.615746, -32.82376],
#=>       dtype=float32)

See also help(ny.data['benson_winawer_2018']) or print(ny.data['benson_winawer_2018'].__doc__).

Commands

Currently Neuropythy is undergoing rapid development, but to get started, the neuropythy.commands package contains functions that provide command-line interfaces for the various routines included. Any of these commands may be invoked by calling Neuropythy's main function and passing the name of the command as the first argument, followed by any additional command arguments. The argument --help may be passed for further information about each command.

If neuropythy is installed on your machine, then you can execute a command like so:

> python -m neuropythy surface_to_image --help
> python -m neuropythy atlas --verbose bert

Docker

There is a Docker image containing Neuropythy that can be used to run the Neuropythy commands quite easily without installing Neuropythy itself. If you have Docker installed, you can use Neuropythy as follows:

# If your FreeSurfer subjects directory is /data/subjects and you want to
# apply the Benson2014 template to a subject bert:
docker run -ti --rm -v /data/subjects:/subjects nben/neuropythy \
           atlas --verbose bert

The Docker image can now also be used to start a notebook server; you can either build this yourself using docker-compose (in which case any local changes to the neuropythy code will be included) or you may use the nben/neuropythy image on Docker Hub.

Using docker-compose

To build the docker image locally:

git clone https://github.com/noahbenson/neuropythy
cd neuropythy
# This command will take some time to build the VM;
docker-compose build
# This will start the notebook server (and will build
# the docker first if you haven't run the above
# command). Note, however, that this command won't
# rebuild the container if you have local changes.
docker-compose up

The above instructions will create a notebook server running on port 8888; to change this, you can either edit the docker-compose.yml file or instead use docker-compose run:

docker-compose run -p 8888:8080 neuropythy notebook

Assuming that your FreeSurfer subjects directory and your HCP subjects directory, if any, are set via the SUBJECTS_DIR and HCP_SUBJECTS_DIR environment variables, these directories will be available inside the docker VM as /data/freesurfer_subjects and /data/hcp/subjects. Additionally, your NPYTHY_DATA_CACHE_ROOT, HCP_CREDENTIALS, HCP_AUTO_DOWNLOAD, and other environment variables will be forwarded to neuropythy. Note, however, that if your HCP_CREDENTIALS variable contains a filename, you will need to put the literal credentials in the variable instead, as the docker image cannot read files from your host.
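
For example, if your credentials are currently stored in a file (the ~/.hcp-passwd path below is hypothetical), one workaround is to read the file on the host so that the variable, and thus the container, receives the literal value:

# Put the literal credentials (not the filename) into the environment:
export HCP_CREDENTIALS="$(cat ~/.hcp-passwd)"
docker-compose up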

Using nben/neuropythy from Docker Hub

To run the notebook server using the prepared docker-image:

# fetch the docker:
docker pull nben/neuropythy:latest
# run the notebook server
docker run -it \
           -v "$SUBJECTS_DIR:/data/freesurfer_subjects" \
           -v "$HCP_SUBJECTS_DIR:/data/hcp/subjects" \
           -p 8888:8888 \
       nben/neuropythy notebook

Note that the lines starting with -v can each be omitted if you don't want to mount your subject directories inside the docker and/or if you don't have HCP/FreeSurfer subjects.

Citing

To cite Neuropythy, please reference the following:

References

A Note about Licenses

Versions of neuropythy prior to version 0.9.5 have all employed the GPLv3 license. Starting with version 0.9.5, however, neuropythy uses the Affero GPL (AGPL) license. This license requires that anyone providing a service that runs neuropythy over a network provide the source code to the version of neuropythy that they are running.

To be clear: this license does not interfere with any scientific or personal use of neuropythy, nor does it forbid commercial use outright. If you wish to use neuropythy in commercial software, you must only do one of the following:

License

This README file is part of the Neuropythy library.

This program is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details.

You should have received a copy of the GNU Affero General Public License along with this program. If not, see <https://www.gnu.org/licenses/>.