BICLab / Attention-SNN

Official implementation of "Attention Spiking Neural Networks" (IEEE T-PAMI 2023)
https://ieeexplore.ieee.org/abstract/document/10032591
MIT License

Doubt about the accuracy claimed for Attention SNN #4

Open A227902 opened 5 months ago

A227902 commented 5 months ago

I set 'dt' to 15 and 'T' to 60, as given in TABLE I for the DVS128 Gesture dataset, and got a testing accuracy of 89.9305% at epoch 138, whereas the paper reports 96.53%. Is there any modification to the code needed to reach the claimed accuracy?

A227902 commented 5 months ago

Thank you for your response. I am also using a CPU; I will try your 'dt' setting.

StCross commented 4 months ago

I have the same question. I got 90.1% accuracy when reproducing; can you provide the dt and T for a better accuracy?

oteomamo commented 4 months ago

I tested all A-SNNs from this paper, and my results are consistent with those reported. If you are getting accuracy around 90%-92%, it is because you are training a vanilla SNN without attention. To enable attention, specify the attention type in the Config.py file of each dataset under the self.attention = "no" hyperparameter; you can set it to CA, TA, SA, CSA, TCA, TSA, TCSA, or no. The results shown in the paper use dt = 15 and T = 60 for the DVS128 Gesture dataset. The only difference in my results was that CSA gave the best test accuracy, at 96.32%. These are all GPU results; CPU results should be the same, just with a longer training time.
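
For reference, a minimal sketch of the relevant settings (attribute names follow this thread; the exact Config.py in each dataset folder may differ):

```python
# Hypothetical excerpt of a dataset's Config.py, based on the values discussed above.
class Config:
    def __init__(self):
        self.dt = 15            # time resolution used in TABLE I for DVS128 Gesture
        self.T = 60             # number of time steps
        # Attention type: "CA", "TA", "SA", "CSA", "TCA", "TSA", "TCSA", or "no".
        # Leaving this as "no" trains a vanilla SNN (~90%-92% accuracy);
        # pick one of the attention variants to reproduce the paper's numbers.
        self.attention = "TCSA"
```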