fangwei123456 / Spike-Element-Wise-ResNet

Deep Residual Learning in Spiking Neural Networks

Question about SEW block #13

mi804 opened this issue 2 years ago

Hello, during my experiments I found that the output of a SEW block is not only 0 or 1 (i.e., a spike), but can also be a positive integer such as 2, 3, 4, 5, etc. This behavior follows from the method itself, but these values will be multiplied by the weights in the convolution of the next layer. Does this violate the original intention of spiking neural networks, which transmit information as binary spikes? Will SEW ResNet still be efficient when implemented on an actual neuromorphic chip? Might it be an ANN rather than an SNN?

fangwei123456 commented 2 years ago

> Hello, during my experiments I found that the output of a SEW block is not only 0 or 1 (i.e., a spike)

If you use ADD as the element-wise function, it is normal to see integer values.

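A minimal sketch (not the repository's exact code) of the three element-wise connect functions from the paper, showing why ADD produces non-binary outputs while AND and IAND keep the output binary:

```python
import torch

def sew_connect(s: torch.Tensor, x: torch.Tensor, connect_f: str = 'ADD') -> torch.Tensor:
    # s: binary spikes from the residual branch, x: binary spikes from the shortcut.
    if connect_f == 'ADD':
        return s + x             # can yield 2 where both branches spike
    elif connect_f == 'AND':
        return s * x             # logical AND, stays in {0, 1}
    elif connect_f == 'IAND':
        return (1. - s) * x      # (NOT s) AND x, stays in {0, 1}
    raise NotImplementedError(connect_f)

s = torch.tensor([0., 1., 1., 0.])
x = torch.tensor([1., 1., 0., 0.])
print(sew_connect(s, x, 'ADD'))   # tensor([1., 2., 1., 0.]) -- a 2 appears
print(sew_connect(s, x, 'AND'))   # tensor([0., 1., 0., 0.])
print(sew_connect(s, x, 'IAND'))  # tensor([1., 0., 0., 0.])
```

With ADD, the maximum possible value grows by one per stacked block, which is consistent with the 2, 3, 4, 5 observed in the question.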

> Will SEW ResNet still be efficient when implemented on an actual neuromorphic chip?

It depends on the chip. Loihi 2 supports graded spikes (I guess they are 8-bit).

mi804 commented 2 years ago

Thank you for your answer. On such a chip, is the multiplication between an 8-bit spike value and a weight acceptable? Can I take the SEW block as the baseline for structural design and compare it with other structures without considering efficiency?

fangwei123456 commented 2 years ago

> On such a chip, is the multiplication between an 8-bit spike value and a weight acceptable?

I do not know much about it.

> Can I take the SEW block as the baseline for structural design and compare it with other structures without considering efficiency?

That is up to you. Using ADD gives higher accuracy, but may draw criticism. You can use a binary connect_f to avoid the non-binary outputs caused by ADD.
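For illustration, a hedged sketch of what switching to a binary connect function could look like; the import path and the `sew_resnet18` name are assumptions modeled on this repository's conventions (only the `connect_f` keyword comes from this thread), so check the actual model definitions:

```python
# Hypothetical usage sketch: the import path and constructor name are
# assumptions; verify against the repository's actual model code.
from models import sew_resnet18  # hypothetical import path

net_add  = sew_resnet18(connect_f='ADD')   # higher accuracy, outputs can exceed 1
net_iand = sew_resnet18(connect_f='IAND')  # binary connect function, outputs stay in {0, 1}
```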

mi804 commented 2 years ago

Ok. Thanks a lot!