SPyRiT

SPyRiT is a PyTorch-based deep image reconstruction package primarily designed for single-pixel imaging.

Installation

The spyrit package is available for Linux, macOS, and Windows. We recommend using a virtual environment.
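
For example, a virtual environment can be created and activated with the standard venv module (the .venv name below is just a convention):

python -m venv .venv
source .venv/bin/activate      # Linux/macOS
.venv\Scripts\activate         # Windows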

Linux and macOS

(user mode)

pip install spyrit

(developer mode)

git clone https://github.com/openspyrit/spyrit.git
cd spyrit
pip install -e .

Windows

On Windows, you may need to install PyTorch first. It may also be necessary to run the following commands with administrator rights (e.g., by starting your Python environment as an administrator).

Adapt the two examples below to your configuration (see the official PyTorch installation instructions at https://pytorch.org/get-started/locally/ for the latest commands).

(CPU version using pip)

pip3 install torch torchvision torchaudio

(GPU version using conda)

conda install pytorch torchvision torchaudio pytorch-cuda=12.4 -c pytorch -c nvidia
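
Either way, you can verify the PyTorch installation from Python, and check whether a CUDA device is visible:

import torch
print(torch.__version__)          # installed PyTorch version
print(torch.cuda.is_available())  # True if a CUDA device is visible (GPU build only)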

Then, install SPyRiT using pip:

(user mode)

pip install spyrit

(developer mode)

git clone https://github.com/openspyrit/spyrit.git
cd spyrit
pip install -e .

Test

To check the installation, run the following in a Python interpreter:

import spyrit
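
To also confirm which version was installed, the standard-library importlib.metadata module works for any pip-installed package:

import importlib.metadata
print(importlib.metadata.version("spyrit"))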

Get started - Examples

To get started, check the documentation tutorials. These tutorials must be run from the tutorial folder, as they load image samples from spyrit/images/:

cd spyrit/tutorial/

More advanced reconstruction examples can be found in spyrit-examples/tutorial. The advanced tutorials can also be run in Google Colab.
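
As a library-agnostic illustration of what single-pixel reconstruction involves (plain PyTorch only; this is not SPyRiT's API, and the sizes are arbitrary), a linear forward model and its pseudo-inverse reconstruction look like this:

import torch

N, M = 32 * 32, 256                 # N pixels in a vectorized image, M < N measurements
x = torch.rand(N)                   # toy image, flattened to a vector
H = torch.randn(M, N)               # measurement patterns, one per row
y = H @ x                           # simulated single-pixel measurements
x_hat = torch.linalg.pinv(H) @ y    # least-squares reconstruction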

API Documentation

https://spyrit.readthedocs.io/

Contributors (alphabetical order)

How to cite?

When using SPyRiT in scientific publications, please cite the following paper:

When using SPyRiT specifically for the denoised completion network, please cite the following paper:

License

This project is licensed under the LGPL-3.0 License; see the LICENSE.md file for details.

Acknowledgments