iCGY96 / syops-counter

Synaptic OPerations (SyOPs) counter for spiking neural networks

This script computes the theoretical number of synaptic operations in spiking neural networks, including accumulate (AC) and multiply-accumulate (MAC) operations. It can also count parameters and print the per-layer computational cost of a given network. This tool is still under construction; comments, issues, contributions, and collaborations are all welcome!
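For intuition, here is a minimal illustrative sketch (not the package's exact counting rule) of how synaptic operations are commonly estimated for a fully connected layer: binary spike inputs trigger accumulates (ACs) only on active synapses, in proportion to the firing rate, while dense real-valued inputs require one multiply-accumulate (MAC) per synapse. The function name and signature below are made up for illustration.

# Illustrative sketch only -- not the package's exact counting rule.
# For a Linear layer: binary spike inputs contribute ACs in proportion to the
# firing rate, while dense real-valued inputs need one MAC per synapse.
def linear_syops(in_features, out_features, firing_rate, spiking_input):
    connections = in_features * out_features  # synapses traversed per forward pass
    if spiking_input:
        return firing_rate * connections, 0   # (ACs, MACs)
    return 0, connections                     # (ACs, MACs)

# Example: a 512 -> 256 layer with 10% average spike activity.
print(linear_syops(512, 256, firing_rate=0.1, spiking_input=True))   # (13107.2, 0)
print(linear_syops(512, 256, firing_rate=0.1, spiking_input=False))  # (0, 131072)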

Supported layers:

Experimental support:

Requirements: PyTorch >= 1.1, torchvision >= 0.3, spikingjelly <= 0.0.0.0.12

Usage

Install the latest version

From PyPI:

pip install syops

From this repository:

pip install --upgrade git+https://github.com/iCGY96/syops-counter

Example

import torch
from spikingjelly.activation_based import surrogate, neuron, functional
from spikingjelly.activation_based.model import spiking_resnet
from syops import get_model_complexity_info

dataloader = ...
with torch.cuda.device(0):
    net = spiking_resnet.spiking_resnet18(pretrained=True, spiking_neuron=neuron.IFNode, 
            surrogate_function=surrogate.ATan(), detach_reset=True)
    ops, params = get_model_complexity_info(net, (3, 224, 224), dataloader, as_strings=True,
                                            print_per_layer_stat=True, verbose=True)
    # ops is reported as (total OPs, ACs, MACs)
    _, acs, macs = ops
    print('{:<30}  {:<8}'.format('Computational complexity ACs:', acs))
    print('{:<30}  {:<8}'.format('Computational complexity MACs:', macs))
    print('{:<30}  {:<8}'.format('Number of parameters: ', params))
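
As a follow-up to the example above, the counts can also be retrieved as raw numbers by setting as_strings=False, which is convenient for deriving energy estimates like those in the Benchmark section below. The unpacking order used here, (total OPs, ACs, MACs), is the same assumption as in the example.

# Illustrative follow-up: raw counts instead of formatted strings
# (assumes the same (total OPs, ACs, MACs) ordering as above).
ops, params = get_model_complexity_info(net, (3, 224, 224), dataloader, as_strings=False,
                                        print_per_layer_stat=False, verbose=False)
_, acs, macs = ops
print('ACs: {:.2f} G, MACs: {:.2f} G, Params: {:.2f} M'.format(acs / 1e9, macs / 1e9, params / 1e6))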

Benchmark

Model             Input Resolution  Params (M)  ACs (G)  MACs (G)  Energy (mJ)  Acc@1  Acc@5
spiking_resnet18  224x224           11.69       0.10     0.14      0.734        62.32  84.05
sew_resnet18      224x224           11.69       0.50     2.75      13.10        63.18  84.53
DSNN18 (AAP)      224x224           11.69       1.69     0.20      2.44         63.46  85.14
resnet18          224x224           11.69       0.00     1.82      8.372        69.76  89.08
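
The Energy column is consistent with the 45 nm CMOS estimates commonly used in the SNN literature, roughly 4.6 pJ per MAC and 0.9 pJ per AC; the short sketch below reproduces it under that assumption.

# Energy estimate assuming ~4.6 pJ per MAC and ~0.9 pJ per AC (45 nm CMOS figures
# commonly used in the SNN literature); inputs are in giga-operations (G).
E_AC_PJ, E_MAC_PJ = 0.9, 4.6

def energy_mj(acs_g, macs_g):
    # G-ops * pJ/op = mJ  (1e9 ops * 1e-12 J/op = 1e-3 J)
    return acs_g * E_AC_PJ + macs_g * E_MAC_PJ

print(energy_mj(0.10, 0.14))  # ~0.734 mJ (spiking_resnet18)
print(energy_mj(0.00, 1.82))  # ~8.372 mJ (resnet18)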

Citation

If you find this work useful for your research, please consider giving a star :star: and a citation :beer::

@article{chen2023training,
  title={Training Full Spike Neural Networks via Auxiliary Accumulation Pathway},
  author={Chen, Guangyao and Peng, Peixi and Li, Guoqi and Tian, Yonghong},
  journal={arXiv preprint arXiv:2301.11929},
  year={2023}
}

Acknowledgements

This repository was developed based on ptflops.