BindsNET / bindsnet

Simulation of spiking neural networks (SNNs) using PyTorch.
GNU Affero General Public License v3.0

How does backpropagation work? #656

Closed Whale-xh closed 10 months ago

Whale-xh commented 10 months ago

Could you please tell me where backpropagation appears in the supervised_mnist.py example? I couldn't find it.

Hananel-Hazan commented 10 months ago

This question is better suited to the Discussions section, since you are not reporting an issue with the BindsNET code. Backpropagation is not used in supervised_mnist.py; the script trains the network with guided (supervised) STDP, which changes the weights accordingly.
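For reference, a minimal sketch of what the example does instead of backpropagation, assuming the DiehlAndCook2015 model and Poisson-encoded MNIST digits as in supervised_mnist.py (keyword names such as `inputs` vs. `inpts`, and the expected tensor shapes, may differ between BindsNET versions):

```python
import torch
from bindsnet.models import DiehlAndCook2015
from bindsnet.encoding import poisson

# Two-layer Diehl & Cook network; the X -> Ae connection learns with a local
# STDP rule, so weight updates come from spike timing, not from a loss gradient.
network = DiehlAndCook2015(n_inpt=784, n_neurons=100, dt=1.0)

time = 250                      # simulation time per example (ms)
image = torch.rand(784) * 128   # stand-in for one flattened MNIST digit (intensities)
spikes = poisson(datum=image, time=time, dt=1.0)  # Poisson-encoded input spike train

# Supervision ("guided" STDP): clamp a few excitatory neurons assigned to the
# true label so they fire during the presentation; STDP then strengthens the
# weights onto that group of neurons.
label, per_class, n_clamp = 3, 10, 5
clamp = {"Ae": per_class * label + torch.arange(n_clamp)}

# Depending on your BindsNET version you may need a batch dimension on `spikes`.
network.run(inputs={"X": spikes}, time=time, clamp=clamp)
```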

Whale-xh commented 10 months ago

Hello, my input data is a two-dimensional matrix. Why doesn't the loss value change significantly when I use BindsNET for a regression problem? What should I pay attention to in order to solve this problem? Thank you!

rafaelblevin821 commented 10 months ago

> Hello, my input data is a two-dimensional matrix. Why doesn't the loss value change significantly when I use BindsNET for a regression problem? What should I pay attention to in order to solve this problem? Thank you!

You are talking about backpropagation, loss, and regression. None of these are used in the supervised_mnist script.

Hananel-Hazan commented 10 months ago

When using the supervised_mnist.py script, you should pay attention to how the input is encoded, how strong the teaching signal is, and how much spiking activity there is in each of the three layers.
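One way to check the last point is to attach a Monitor to each layer and inspect the spike counts after a presentation. A rough sketch, assuming the layer names X/Ae/Ai from the example and that `network` and `spikes` have been set up as in the sketch above:

```python
from bindsnet.network.monitors import Monitor

time = 250

# Record the spike state variable "s" of every layer for the whole presentation.
for name, layer in network.layers.items():
    network.add_monitor(Monitor(layer, state_vars=["s"], time=time), name=name)

network.run(inputs={"X": spikes}, time=time)

# If a layer is nearly silent (or saturated), revisit the input encoding
# intensity and the strength of the clamping / teaching signal.
for name in network.layers:
    s = network.monitors[name].get("s")  # recorded spikes, roughly [time, ..., n_neurons]
    print(name, "total spikes:", int(s.sum()))
```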

Whale-xh commented 10 months ago

Hello, may I ask what the variables `choice` and `clamp` in supervised_mnist.py do, and what the `clamp` argument in the `run` call is for? What is the purpose of this setting?

Hananel-Hazan commented 10 months ago

Those features are used to guide (supervise) the training process by depressing or strengthening the appropriate weights.
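To make that concrete: `choice` picks a few neuron indices within each label's block of excitatory neurons, and `clamp` (passed to `network.run`) forces those neurons, offset by the current label, to spike during the presentation, so STDP strengthens the weights onto the "correct" group. A sketch with variable names taken from the example; the exact sizes depend on your settings:

```python
import numpy as np
import torch

n_neurons = 100                      # excitatory neurons in layer "Ae"
n_classes = 10
per_class = n_neurons // n_classes   # neurons reserved per digit class
n_clamp = 1                          # how many of those neurons to force to spike

# 'choice': random offsets inside a class's block of neurons (fixed across training).
choice = np.random.choice(per_class, size=n_clamp, replace=False)

label = 7  # true label of the current example
# 'clamp': absolute indices of the "Ae" neurons assigned to this label; they are
# forced to fire on every timestep of network.run(..., clamp={"Ae": ...}).
clamp = {"Ae": per_class * label + torch.from_numpy(choice).long()}
```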