jeshraghian / snntorch

Deep and online learning with spiking neural networks in Python
https://snntorch.readthedocs.io/en/latest/
MIT License
1.35k stars · 222 forks

Reset of snn.Leaky(beta=0.8) is S(t)*U_thr and not beta*S(t)*U_thr #341

Open alexbababu opened 3 months ago

alexbababu commented 3 months ago

Description

I was working through snnTorch Tutorial 3 on LIF neurons in feed-forward networks. In Chapter 2, an snn.Leaky neuron is created and its behaviour under constant current input is tested, with the membrane potential shown in a plot. The tutorial explains that a soft-reset mechanism is implemented: in contrast to the neuron implemented in Chapter 1.4, the reset subtracts beta*U_threshold. The membrane potential should therefore fall to ~0.2 after a spike, not to ~0.0. This does not happen for me. I tested on Windows, Linux, and Google Colab. My colleague has a Mac, and for him it works as explained and shown here: https://raw.githubusercontent.com/jeshraghian/snntorch/master/docs/_static/img/examples/tutorial3/_static/snn.leaky_step.png
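To make the discrepancy concrete, here is a minimal plain-Python sketch of the two subtractive-reset variants. This is not the snntorch implementation; the constants (beta=0.8, threshold=1.0, constant input 0.21 switched on at step 10) are inferred from the mem_rec trace in this report.

```python
# Minimal leaky integrate-and-fire sketch comparing the two soft-reset
# variants discussed in this issue (assumed constants, not snntorch code).

def simulate(num_steps=30, beta=0.8, thr=1.0, inp=0.21, scale_reset=False):
    """LIF neuron with a subtractive (soft) reset.

    scale_reset=False subtracts thr on a spike (behaviour I observe);
    scale_reset=True subtracts beta*thr (behaviour the tutorial shows).
    """
    mem, spk, mem_rec = 0.0, 0.0, []
    for t in range(num_steps):
        cur = inp if t >= 10 else 0.0            # constant input from step 10
        reset = spk * (beta * thr if scale_reset else thr)
        mem = beta * mem + cur - reset           # leaky integration + reset
        spk = 1.0 if mem > thr else 0.0          # spike when threshold crossed
        mem_rec.append(mem)
    return mem_rec

# The first spike occurs at step 23 (mem ~ 1.0038); the value one step
# later shows the difference between the two variants:
print(round(simulate()[24], 4))                  # 0.0131 -> falls to ~0.0
print(round(simulate(scale_reset=True)[24], 4))  # 0.2131 -> falls to ~0.2
```

With scale_reset=False the sketch reproduces the trace below (0.21, 0.378, ..., 1.0038, 0.0131); with scale_reset=True it reproduces the tutorial plot, which falls to ~0.2 after each spike.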

What I Did

Ran all cells in snntorch_tutorial_3.ipynb

I get this: [screenshot: snntorch_leaky_reset]

I expect this: [screenshot: snn.leaky_step]

mem_rec
tensor([0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000,
        0.0000, 0.2100, 0.3780, 0.5124, 0.6199, 0.7059, 0.7747, 0.8298, 0.8738,
        0.9091, 0.9373, 0.9598, 0.9778, 0.9923, 1.0038, 0.0131, 0.2204, 0.3864,
        0.5191, 0.6253, 0.7102, 0.7782, 0.8325, 0.8760, 0.9108, 0.9387, 0.9609,
        0.9787, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867,
        0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610,
        0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867,
        0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610,
        0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867,
        0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610,
        0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387])
alexbababu commented 3 months ago

My colleague on Mac was using snntorch==0.7.0 and torch 2.2.1. I installed a venv with the same versions and now I get the expected behaviour.

It looks like torch>=2.3.1 or snntorch==0.9.1 broke the soft reset.
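For anyone hitting the same behaviour, a possible workaround (assuming the version combination above is what matters) is to pin the versions reported to work in a fresh virtual environment:

```shell
# Create an isolated environment and pin the versions that reproduced
# the tutorial's expected soft-reset behaviour in this report.
python -m venv snn-env
source snn-env/bin/activate
pip install "snntorch==0.7.0" "torch==2.2.1"
```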