Closed — Whale-xh closed this issue 10 months ago
This question is better suited to the Discussions section, as you have not reported any issue with the BindsNET code. Backpropagation is not used in supervised_mnist.py:
the algorithm uses guided STDP to train the network, adjusting the weights accordingly.
Hello, my input data is a two-dimensional matrix. Why doesn’t the loss value change significantly when I use BindsNET for a regression problem? What should I pay attention to in order to solve this? Thank you!
You are talking about backpropagation, loss, and regression. None of these are used in the supervised_mnist script.
Using the supervised_mnist.py
script, you should pay attention to how the input is encoded, the strength of the teaching signal, and how much spiking activity exists in all three layers.
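To make the first point concrete, here is a minimal stdlib-only sketch of rate-based spike encoding in the spirit of BindsNET's Poisson encoder. This is not BindsNET's actual implementation; `poisson_encode` and `max_rate` are hypothetical names chosen for illustration, and the idea shown is simply that higher input intensity should translate into more spikes per time window.

```python
import random

def poisson_encode(intensity, time_steps, max_rate=0.25):
    """Convert an intensity in [0, 1] to a Bernoulli spike train.

    Simplified stand-in for a Poisson-style encoder: each time step
    emits a spike (1) with probability proportional to the intensity.
    """
    p = intensity * max_rate
    return [1 if random.random() < p else 0 for _ in range(time_steps)]

random.seed(0)
train = poisson_encode(0.8, time_steps=100)
print(sum(train))  # total spikes emitted over the 100-step window
```

If an input matrix is poorly scaled (e.g. intensities near zero), the encoded spike trains carry almost no activity, which is one common reason downstream layers barely change during training.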
Hello, may I ask what the variables 'choice' and 'clamp' do in the supervised_mnist.py file, and what 'clamp' does in the run statement? What is the purpose of this setting?
Those features are used to guide (supervise) the training process, depressing or strengthening the right weights.
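The mechanism can be sketched as follows, assuming output neurons are split into equal-sized blocks, one per class. This is a hypothetical stdlib-only illustration, not BindsNET's code: `build_clamp` and the variable names are mine. The point is that for each labelled example, a neuron inside the label's block is chosen at random ('choice') and forced to spike ('clamp'), so STDP strengthens that neuron's incoming weights for inputs of that class.

```python
import random

n_neurons = 100
n_classes = 10
per_class = n_neurons // n_classes  # 10 output neurons per class

def build_clamp(label):
    """Pick one neuron inside the label's block to force-spike."""
    choice = random.randrange(per_class)        # random offset within the block
    clamped_index = per_class * label + choice  # absolute index in the output layer
    return {clamped_index}

clamp = build_clamp(label=3)
index = next(iter(clamp))
print(index)  # falls somewhere in class 3's block, indices 30..39
```

Passing such a set of indices to the network's run call is what turns unsupervised STDP into a supervised procedure: the clamped neuron is guaranteed to be active when its class is presented.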
Could you please tell me where backpropagation appears in the supervised_mnist.py example? I couldn’t find it.