There is a function `sinabs.utils.normalize_weights` which allows you to normalize weights based on a given percentile of output activity. It roughly follows this paper. You can find further details in the docs.
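A minimal sketch of how this could look, assuming the signature shown in the sinabs docs (model, sample data, and lists of layer names; please double-check against your installed version). The toy network and random data below are stand-ins for your trained model and a representative batch:

```python
import torch
import torch.nn as nn
from sinabs.utils import normalize_weights

# Toy stand-ins: use your trained model and a batch of real data in practice.
ann = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3),   # module name "0"
    nn.ReLU(),                        # module name "1"
    nn.Conv2d(8, 16, kernel_size=3),  # module name "2"
    nn.ReLU(),                        # module name "3"
)
sample_data = torch.rand(32, 1, 28, 28)

# Rescale the weights of param_layers so that the given percentile of the
# activations observed at output_layers maps to 1.
normalize_weights(
    ann,
    sample_data,
    output_layers=["1", "3"],  # names of the activation layers to probe
    param_layers=["0", "2"],   # names of the layers whose weights are rescaled
    percentile=99,             # treat the top 1% of activations as outliers
)
```

The percentile is the main knob: a higher value normalizes against larger activations, which avoids clipping outliers but lowers firing rates (more discretization error), while a lower value lets the largest activations saturate after conversion. Sweeping a few values (e.g. 90–99.9) on a validation set is a reasonable way to pick it.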
During training you can use `sinabs.layers.NeuromorphicReLU` as the activation function instead of the usual ReLU. It is a step function in the forward pass (approximating rate coding in an SNN) and uses a surrogate gradient in the backward pass.
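For illustration, a sketch of what swapping the activations might look like. `quantize` and `fanout` are the constructor arguments I am aware of, but verify them against your sinabs version; the fanout values here are hypothetical estimates of outgoing connections per neuron:

```python
import torch.nn as nn
from sinabs.layers import NeuromorphicReLU

# Same toy network as above, with NeuromorphicReLU in place of nn.ReLU.
# quantize=True rounds activations to integers in the forward pass (rate
# coding); gradients still flow through the surrogate in the backward pass.
# fanout weights the layer's activity by its outgoing connections, which is
# used when counting synaptic operations.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3),
    NeuromorphicReLU(quantize=True, fanout=16 * 3 * 3),  # next layer: 16 channels, 3x3 kernels
    nn.Conv2d(8, 16, kernel_size=3),
    NeuromorphicReLU(quantize=True, fanout=10),          # next layer: 10 output units
    nn.Flatten(),
    nn.Linear(16 * 24 * 24, 10),
)
```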
You might also regularize the expected number of synaptic operations already during training. For more details on both, see the docs and this paper.
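One way this could look in practice, sketched under the assumption that `NeuromorphicReLU` records its fanout-weighted activation count in an `activity` attribute after each forward pass (this matches the versions I have seen, but check yours). The helper names and `lambda_synops` weighting are hypothetical:

```python
import torch
import torch.nn as nn
from sinabs.layers import NeuromorphicReLU

def synops_penalty(model):
    # Sum the activity recorded by each NeuromorphicReLU after a forward
    # pass; this approximates the expected number of synaptic operations
    # and is differentiable, so it can be added to the loss.
    return sum(
        layer.activity
        for layer in model.modules()
        if isinstance(layer, NeuromorphicReLU)
    )

def training_step(model, x, y, optimizer, task_criterion, lambda_synops=1e-5):
    optimizer.zero_grad()
    out = model(x)
    # Task loss plus a penalty that encourages low synaptic-operation counts.
    loss = task_criterion(out, y) + lambda_synops * synops_penalty(model)
    loss.backward()
    optimizer.step()
    return loss
```

Note that the paper linked above regularizes towards a target number of operations rather than simply minimizing them; `(synops_penalty(model) - target) ** 2` would be the corresponding variant.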
Hello,
I have a quick question about the normalization of layers during ANN-to-SNN conversion. Is there a good way to estimate the conversion parameters (e.g. the percentile)?
Alternatively, is there a smart way to adjust the parameters of the `from_torch` function, changing the spiking hyperparameters for better accuracy? Or is there another trick during training to preserve accuracy after conversion?
Thanks!