synsense / sinabs

A PyTorch-based deep learning library for spiking neural networks that focuses on fast training and supports inference on neuromorphic hardware.
https://sinabs.readthedocs.io
GNU Affero General Public License v3.0

Conversion error mitigation #231

Closed narduzzi closed 5 months ago

narduzzi commented 8 months ago

Hello,

I have a quick question about the normalization of layers during ANN-to-SNN conversion. Is there a good way to estimate the conversion parameters (e.g. the percentile)?

Alternatively, is there a smart way to adjust the parameters of the from_torch function, i.e. the spiking hyperparameters, to improve accuracy? Or is there another trick during training to preserve accuracy after conversion?

Thanks!

bauerfe commented 8 months ago

There is a function sinabs.utils.normalize_weights that lets you normalize weights based on a given percentile of output activity. It roughly follows this paper. You can find further details in the docs.
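
If it helps, here is a minimal sketch of how it can be called. The network, the layer names, and the input shape are made up for illustration; I'm assuming the signature from the docs, where the output and parameter layers are given as module names (as returned by named_modules()):

```python
import torch
import torch.nn as nn
from sinabs.utils import normalize_weights

# Toy ANN; the layer names ("conv1", "relu1", ...) are illustrative.
ann = nn.Sequential()
ann.add_module("conv1", nn.Conv2d(1, 16, 3, padding=1))
ann.add_module("relu1", nn.ReLU())
ann.add_module("conv2", nn.Conv2d(16, 32, 3, padding=1))
ann.add_module("relu2", nn.ReLU())

# A representative batch of real inputs; activation statistics are
# collected from this sample, so it should reflect your actual data.
sample_input = torch.rand(8, 1, 28, 28)

# Rescale each parameter layer so that the 99th percentile of the
# activations of the output layer that follows it becomes 1.
normalize_weights(
    ann,
    sample_input,
    output_layers=["relu1", "relu2"],
    param_layers=["conv1", "conv2"],
    percentile=99,
)
```

The percentile is the main knob: 100 normalizes by the maximum activation and is sensitive to outliers, while something like 99 clips the rare largest activations in exchange for a better overall dynamic range.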

During training you can use sinabs.layers.NeuromorphicReLU as the activation function instead of the usual ReLU. It is a step function in the forward pass (approximating rate coding in an SNN) and uses a surrogate gradient in the backward pass. You can also regularize the expected number of synaptic operations already during training (see the sketch below). For more details on both, see the docs and this paper.
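
Roughly like this; the model and the fanout values are illustrative, and I'm assuming the SynOpCounter from sinabs.synopcounter to build the regularization term:

```python
import torch.nn as nn
from sinabs.layers import NeuromorphicReLU
from sinabs.synopcounter import SynOpCounter

# Toy CNN trained as an ANN and converted to an SNN later.
# fanout is the number of outgoing connections per neuron; it only
# affects the synaptic-operation estimate, not the forward pass.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1),
    NeuromorphicReLU(fanout=32 * 9),  # followed by a 32-channel 3x3 conv
    nn.Conv2d(16, 32, 3, padding=1),
    NeuromorphicReLU(fanout=10),      # followed by a 10-unit linear layer
    nn.Flatten(),
    nn.Linear(32 * 28 * 28, 10),
)

criterion = nn.CrossEntropyLoss()

# Collects the NeuromorphicReLU layers and reports their estimated synops.
synops_counter = SynOpCounter(model.modules())

def loss_fn(output, target, synops_weight=1e-5):
    # Task loss plus a penalty on the estimated number of synaptic
    # operations, which keeps activity low ahead of conversion.
    return criterion(output, target) + synops_weight * synops_counter()
```

The synops_weight trades off task accuracy against activity; it usually needs a small value and some tuning per model.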