bic-L / Masked-Spiking-Transformer

[ICCV-23] Masked Spiking Transformer

snn detail #7

Closed bonelll closed 4 months ago

bonelll commented 4 months ago

Hello, I would like to ask whether there is any significance to using replace_activation_by_neuron in the snn_validata section of the code. Also, how is it related to replace_activation_by_floor in the SNN? What is the function and purpose of snn_validata?

bonelll commented 4 months ago

"Why is there a distinction between TCL and MyFloor, and why TCL is adopted when t=0?"

bic-L commented 4 months ago

Hi, great question! In our work, we use ANN-to-SNN conversion to obtain a high-performance SNN. We specifically employ the MyFloor [2] neuron equivalence (also known as QCFS), which is one of the two representative methods for converting ANNs to SNNs, the other being TCL [1]. You can find more details about these techniques in the respective papers.

As you can see in main.py, we have chosen to use MyFloor (QCFS) for our work.
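For concreteness, here is a minimal sketch of the quantization-clip-floor activation that MyFloor/QCFS uses in place of ReLU when training the proxy ANN. The function name and scalar formulation are illustrative only; the real implementation operates on tensors and the threshold is a trainable parameter:

```python
import math

def qcfs_activation(x, threshold=1.0, L=8):
    """Illustrative QCFS ("MyFloor") activation on a scalar input.

    Approximates ReLU with an L-step staircase so that the proxy ANN's
    activations can match the average firing rate of an IF neuron
    simulated for T = L timesteps. `threshold` plays the role of the
    trainable clipping/firing threshold (lambda).
    """
    # Quantize to one of L levels, with a 0.5 shift to centre each step.
    q = math.floor(x * L / threshold + 0.5)
    # Clip to [0, L], then rescale back into [0, threshold].
    q = max(0, min(L, q))
    return threshold * q / L
```

Inputs below zero map to 0 and inputs above the threshold saturate at the threshold, mirroring the clipped, quantized ReLU described in the QCFS paper.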

[1]Nguyen-Dong Ho and Ik-Joon Chang. Tcl: an ann-to-snn conversion with trainable clipping layers. arXiv preprint arXiv:2008.04509, 2020

[2]Bu, Tong, et al. Optimal ANN-SNN conversion for high-accuracy and ultra-low-latency spiking neural networks. arXiv preprint arXiv:2303.04347, 2023; https://github.com/putshua/SNN_conversion_QCFS/tree/master

bonelll commented 4 months ago

Thank you for your response. I apologize for the interruption, but I have a few more questions. In the code, what is the significance of using "replace_activation_by_neuron" in the "snn_validata" section? Is there any correlation between this and the "replace_activation_by_floor" function in SNN? Additionally, what is the purpose and meaning of "snn_validata"?

bic-L commented 4 months ago

In ANN2SNN conversion, we train a proxy ANN with quantized activations (installed by the "replace_activation_by_floor" function) rather than an SNN with binary activations. After training, "replace_activation_by_neuron" swaps those quantized activations for spiking neurons, converting the proxy ANN into an SNN model, which is what "snn_validata" evaluates. So, essentially, we train an ANN but can validate and run inference as an SNN.
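A hedged sketch of what this kind of activation swapping might look like. The class names and container structure here are stand-ins for illustration, not the repository's actual implementation (which traverses PyTorch modules):

```python
class QCFS:
    """Stand-in for the quantized floor activation used during ANN training."""
    def __init__(self, threshold):
        self.threshold = threshold  # learned firing threshold (lambda)

class IFNeuron:
    """Stand-in for the integrate-and-fire neuron used at SNN inference."""
    def __init__(self, threshold):
        self.threshold = threshold

class Block:
    """Stand-in for a container module holding child layers."""
    def __init__(self, children):
        self.children = children

def replace_activation_by_neuron(module):
    """Recursively swap every quantized activation for an IF neuron,
    carrying the trained threshold over to the spiking model."""
    for i, child in enumerate(module.children):
        if isinstance(child, QCFS):
            module.children[i] = IFNeuron(child.threshold)
        elif isinstance(child, Block):
            replace_activation_by_neuron(child)
```

The key point is that the learned threshold survives the swap, so the resulting IF neuron fires at a rate aligned with the quantized activation it replaced.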

I think this figure can help explain the basic idea more clearly. The proxy ANN learns an appropriate firing threshold that helps align the SNN's firing rate with the ANN's activation, enabling a smooth conversion between the two kinds of models.

figure1
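This firing-rate alignment can also be checked numerically: an integrate-and-fire neuron with a half-threshold initial membrane potential (as in QCFS), driven by a constant input for T timesteps, reproduces a T-level quantized activation. A minimal scalar sketch; the function name and setting are assumptions for illustration:

```python
def if_neuron_rate(x, threshold=1.0, T=8):
    """Simulate an IF neuron with constant input `x` for T timesteps
    and return its average output (spike count * threshold / T).

    With the membrane potential initialised to threshold/2, this average
    matches an L-step quantized ANN activation when T = L.
    """
    v = threshold / 2.0          # half-threshold initialisation
    total = 0.0
    for _ in range(T):
        v += x                   # integrate the input current
        if v >= threshold:
            v -= threshold       # soft reset: subtract the threshold
            total += threshold   # emit a spike carrying `threshold`
    return total / T             # average postsynaptic output
```

For example, a constant input of 0.5 with threshold 1.0 yields 4 spikes over T = 8 steps, an average of 0.5, exactly the value the quantized proxy ANN would produce for that input.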

bonelll commented 4 months ago

Thanks

bic-L commented 4 months ago

I highly recommend reading this paper. It is a very representative work in this field.

Bu, Tong, et al. Optimal ANN-SNN conversion for high-accuracy and ultra-low-latency spiking neural networks. arXiv preprint arXiv:2303.04347, 2023; https://github.com/putshua/SNN_conversion_QCFS/tree/master