pkuxmq / OTTT-SNN

[NeurIPS 2022] Online Training Through Time for Spiking Neural Networks

Why average pooling? #4

Open bjourne opened 3 weeks ago

bjourne commented 3 weeks ago

In spiking_vgg.py, average pooling is used in place of max pooling for the VGG's pooling layers. What is the reason for this deviation?

pkuxmq commented 3 weeks ago

Using average pooling in SNNs is a common practice, originating from early ANN-SNN conversion methods (e.g., [1]): linear operations can be converted easily, while non-linear operations such as max pooling have no theoretical guarantee for conversion. At the same time, average pooling can be viewed as a linear operation and may be integrated into the convolution after training as synaptic operations, which can benefit deployment on neuromorphic hardware. For direct training methods, it is also possible to use max pooling, e.g., [2].
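A minimal NumPy sketch of the point about linearity (the helper names here are illustrative, not from the repo): average pooling commutes with spike accumulation over time, so it acts like a fixed synaptic weight, whereas max pooling does not.

```python
import numpy as np

def avg_pool2x2(x):
    # 2x2 average pooling via block reshaping (H and W must be even)
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def max_pool2x2(x):
    # 2x2 max pooling, same block layout
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# Binary spike maps from two time steps, as produced by spiking neurons
s1 = np.array([[1., 0.], [0., 0.]])
s2 = np.array([[0., 1.], [0., 0.]])

# Average pooling is linear: pooling the accumulated spikes equals
# accumulating the pooled outputs, so the operation can be folded into
# the surrounding linear (synaptic) computation.
print(np.allclose(avg_pool2x2(s1 + s2),
                  avg_pool2x2(s1) + avg_pool2x2(s2)))  # True

# Max pooling is non-linear: it does not commute with accumulation,
# which is why ANN-SNN conversion has no guarantee for it.
print(np.allclose(max_pool2x2(s1 + s2),
                  max_pool2x2(s1) + max_pool2x2(s2)))  # False
```

Here `max_pool2x2(s1 + s2)` gives 1 while the sum of the per-step pooled outputs gives 2, so the rate-coded activity is distorted; the average-pooled values agree exactly.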

[1] Going Deeper in Spiking Neural Networks: VGG and Residual Architectures. Frontiers in Neuroscience, 2019.
[2] Incorporating Learnable Membrane Time Constant to Enhance Learning of Spiking Neural Networks. ICCV 2021.