nitin-rathi / hybrid-snn-conversion

Training spiking networks with hybrid ann-snn conversion and spike-based backpropagation
https://openreview.net/forum?id=B1xSperKvH

about the activation function #7

Open CHonChou opened 3 years ago

CHonChou commented 3 years ago

Is it possible to replace the activation function ReLU in snn.py with PReLU?

CHonChou commented 3 years ago

In other words, I am asking about the activation function in vgg_spiking.py.

nitin-rathi commented 3 years ago

The activation function in vgg_spiking.py is a placeholder. The actual activation is integrate-and-fire (IF), implemented in the LinearSpike/STDB class.
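For readers unfamiliar with this pattern, below is a minimal sketch of what an IF spike activation with a surrogate gradient typically looks like in PyTorch. This is illustrative only, not the repository's exact LinearSpike code; the names `SurrogateSpike`, `if_step`, and the `gamma` value are assumptions for demonstration.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Illustrative IF spike function: fires when the membrane potential
    crosses threshold; the backward pass uses a piecewise-linear surrogate
    gradient so the network remains trainable with backprop."""
    gamma = 0.3  # surrogate-gradient scale (hypothetical value)

    @staticmethod
    def forward(ctx, mem_pot):
        ctx.save_for_backward(mem_pot)
        # Heaviside step: spike = 1 where the threshold-shifted potential > 0
        return (mem_pot > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (mem_pot,) = ctx.saved_tensors
        # Linear surrogate: gradient is nonzero only near the threshold
        surrogate = SurrogateSpike.gamma * torch.clamp(1.0 - mem_pot.abs(), min=0.0)
        return grad_output * surrogate

def if_step(mem, input_current, threshold=1.0):
    """One integrate-and-fire timestep: accumulate input current,
    emit spikes, then soft-reset by subtracting the threshold."""
    mem = mem + input_current
    spikes = SurrogateSpike.apply(mem - threshold)
    mem = mem - spikes * threshold
    return mem, spikes
```

Since the spike itself is non-differentiable, the surrogate gradient is what makes spike-based backpropagation possible; the nominal ReLU in the ANN definition never runs in the SNN forward pass.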

CHonChou commented 3 years ago

If I change the activation function used by the ANN from ReLU to PReLU and AvgPool2d to MaxPool2d, what should I change in vgg_spiking.py accordingly?

CHonChou commented 3 years ago

If I apply batch normalization in the ANN, do I need to make corresponding changes in the SNN?

nitin-rathi commented 3 years ago

This ANN-SNN conversion method only works for ReLU, average pooling, and dropout. If you plan to include batch norm, you may need to define additional blocks for the SNN.
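One common workaround (not part of this repository) is to fold the batch-norm statistics into the preceding convolution's weights before conversion, so the SNN never sees a BatchNorm layer at all. A hedged sketch, assuming a `Conv2d` immediately followed by a `BatchNorm2d` in eval mode; the helper name `fold_bn_into_conv` is hypothetical:

```python
import torch
import torch.nn as nn

def fold_bn_into_conv(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Fold BatchNorm2d running statistics into the preceding Conv2d so
    the converted network only contains conv + activation layers.
    Illustrative helper, not part of this repository."""
    fused = nn.Conv2d(conv.in_channels, conv.out_channels,
                      conv.kernel_size, conv.stride,
                      conv.padding, conv.dilation,
                      conv.groups, bias=True)
    # BN scales each output channel by gamma / sqrt(running_var + eps)
    scale = bn.weight.data / torch.sqrt(bn.running_var + bn.eps)
    fused.weight.data = conv.weight.data * scale.reshape(-1, 1, 1, 1)
    conv_bias = (conv.bias.data if conv.bias is not None
                 else torch.zeros(conv.out_channels))
    # Shift by the running mean, then apply BN's learned scale and shift
    fused.bias.data = (conv_bias - bn.running_mean) * scale + bn.bias.data
    return fused
```

After folding, the network is mathematically equivalent at inference time but structurally matches what the conversion pipeline expects (conv, ReLU, average pooling, dropout).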

CHonChou commented 3 years ago

Thank you a lot for your reply, I'll try to figure this out.