NeuromorphicProcessorProject / snn_toolbox

Toolbox for converting analog to spiking neural networks (ANN to SNN), and running them in a spiking neuron simulator.
MIT License

Poor simulation accuracy when converted to SNN #115

Closed · Mamoru597 closed 2 years ago

Mamoru597 commented 2 years ago

First of all, thank you very much for creating snntoolbox. I am trying to convert a CNN (ResNet18) that classifies the CIFAR-10 dataset into an SNN. The accuracies of the input model and the parsed model are close, but the accuracy after conversion to the SNN is poor. I hope you can help based on the results, settings, and architecture below. I am attaching an image of the model architecture.

- Input model: top-1 accuracy 75%, top-5 accuracy 96%
- Parsed model: top-1 accuracy 74%, top-5 accuracy 76%
- SNN model: total accuracy 11%, accuracy averaged by class size 10%

Is it a problem that the parsed model's accuracy does not exactly match the input model's?

```ini
[paths]
path_wd = path_wd
dataset_path = path_wd
filename_ann = cifar10_cnn

[tools]
evaluate_ann = True
normalize = True

[simulation]
simulator = INI
duration = 256
dt = 1
num_to_test = 100
batch_size = 50
keras_backend = tensorflow

[conversion]
softmax_to_relu = True

[output]
plot_vars = {'error_t', 'activations', 'spiketrains', 'correlation', 'v_mem', 'spikerates'}
```
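For reference, a config like this is typically generated and consumed from Python. The sketch below is illustrative rather than the poster's actual script; it assumes the standard `snntoolbox.bin.run.main` entry point and uses a placeholder working directory:

```python
# Illustrative sketch: building the config above programmatically and
# invoking the toolbox on it. Assumes the standard snntoolbox entry point;
# path_wd is a placeholder.
import configparser
import os

path_wd = '/path/to/working_dir'  # placeholder working directory

config = configparser.ConfigParser()
config['paths'] = {
    'path_wd': path_wd,            # where the model and outputs live
    'dataset_path': path_wd,       # where the test set (.npz) lives
    'filename_ann': 'cifar10_cnn'}
config['tools'] = {
    'evaluate_ann': True,          # test the ANN before conversion
    'normalize': True}             # weight normalization
config['simulation'] = {
    'simulator': 'INI',            # built-in INI simulator
    'duration': 256,               # time steps per sample
    'dt': 1,
    'num_to_test': 100,
    'batch_size': 50,
    'keras_backend': 'tensorflow'}
config['conversion'] = {
    'softmax_to_relu': True}
config['output'] = {
    'plot_vars': {'error_t', 'activations', 'spiketrains',
                  'correlation', 'v_mem', 'spikerates'}}

config_filepath = os.path.join(path_wd, 'config')
with open(config_filepath, 'w') as f:
    config.write(f)

from snntoolbox.bin.run import main
main(config_filepath)  # parse, normalize, convert, and simulate
```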

I changed a few settings, such as extending the simulation duration and setting normalization to True, but the accuracy remained poor.

Then, I plotted the correlation between the SNN and ANN activations using the GUI, but I am not sure what I should be checking. I apologize for the rudimentary question, but could you tell me what to look for? I am also attaching some figures of the correlation.

Attachments: model_architecture.zip, Correlation.zip

rbodo commented 2 years ago

Hi, thanks for the detailed question.

The accuracy drop is at least partially due to the missing activation layers in your input model. The non-leaky integrate-and-fire neurons in our SNN inherently apply a ReLU nonlinearity, so when some of your ANN layers do not use a nonlinearity, there is a mismatch after conversion. This problem occurs in most off-the-shelf ResNet implementations. Check e.g. your BatchNorm layers just before the concatenate: each BatchNorm should be followed by a ReLU (which might mean you have to do some retraining). As a side note, make sure the ReLU comes after, not before, the BatchNorm, because the BatchNorm layer will be fused with the preceding Conv layer during parsing.
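To make the ordering concrete, here is a minimal tf.keras sketch of the pattern described above; the helper names `conv_bn_relu` and `conv_bn` are illustrative, not part of snntoolbox:

```python
# Minimal sketch of the layer ordering described above (tf.keras).
from tensorflow.keras import layers

def conv_bn_relu(x, filters):
    """Conv -> BatchNorm -> ReLU: converts cleanly.

    The BatchNorm is fused into the preceding Conv during parsing, and
    the ReLU matches the rectification inherent in the IF neurons.
    """
    x = layers.Conv2D(filters, 3, padding='same', use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    return layers.ReLU()(x)

def conv_bn(x, filters):
    """Problematic: the branch ends in BatchNorm with no ReLU.

    If this feeds a merge layer (e.g. Concatenate or Add), the IF
    neurons still rectify here, so negative activations are lost and
    the SNN diverges from the ANN.
    """
    x = layers.Conv2D(filters, 3, padding='same', use_bias=False)(x)
    return layers.BatchNormalization()(x)  # missing ReLU -> mismatch
```

Note that `conv_bn_relu` also avoids the Conv -> ReLU -> BatchNorm ordering, which would leave a BatchNorm that cannot be fused into the Conv layer it follows.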

There is a ResNet example here which might be helpful.

Mamoru597 commented 2 years ago

Thank you for your reply.

I tried what you advised. I saw a change, but the accuracy remained low. I will go through the example you provided and look for differences.

Sorry, one more question: even after placing an Activation layer after BatchNorm, the SNN accuracy stayed around 10%. Could this be a problem in how the model is parsed, or in the simulation settings?

rbodo commented 2 years ago

Please reopen if the ResNet example script didn't help solve the issue.