BICLab / Attention-SNN

Official implementation of "Attention Spiking Neural Networks" (IEEE T-PAMI 2023)
https://ieeexplore.ieee.org/abstract/document/10032591
MIT License

The question about the accuracy of the ImageNet dataset. #5

Open OuYangLiang0509 opened 7 months ago

OuYangLiang0509 commented 7 months ago

Hello author, in your paper the ImageNet dataset achieves an impressive accuracy of 75.92% with T=1 and ResNet-104. However, I am only able to achieve 70.56% accuracy using parameters consistent with your supplementary materials. Furthermore, when examining the events file you provided in TensorBoard, I noticed that the test set accuracy is 74.14% while the training set accuracy is only 64.25%, i.e. the test set accuracy is roughly 10% higher than the training set accuracy. I look forward to your response.

oteomamo commented 6 months ago

Are you running the tests on a vanilla or an attention-based SNN? To enable attention, you should specify the type of attention you want to use in the Config.py file of each dataset via the self.attention = "no" hyperparameter. You can set this to CA, TA, SA, CSA, TCA, TSA, TCSA, or no. After this you should get results within the margin of error. However, one 'problem' I faced with the paper was the clip hyperparameter. I ran all my tests with clip set to 1 and got the same results as the ones in the paper. Are you running your code with clip set to 1 too?
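For reference, the relevant part of a dataset's Config.py might look roughly like the sketch below. Only the self.attention attribute and its allowed values are taken from the comment above; the class shape, the clip attribute name, and the chosen defaults are illustrative assumptions, not the repository's actual code.

```python
class Config:
    """Hypothetical sketch of the hyperparameters discussed above."""

    def __init__(self):
        # Attention module to enable: one of "CA", "TA", "SA", "CSA",
        # "TCA", "TSA", "TCSA", or "no" (vanilla SNN without attention).
        self.attention = "TCSA"
        # Clipping hyperparameter; the commenter reports reproducing the
        # paper's numbers with this set to 1 (attribute name assumed).
        self.clip = 1


config = Config()
print(config.attention, config.clip)
```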