Closed · I159 closed this issue 6 years ago
Back propagation requires the return values of each layer, because the common weight-correction rule is:
w_ij -= C(A(x_ij), d) * x_ij   (for each neuron)
Where:
w_ij is the synapse (weight) between layer_i and layer_{i-1}
C() is the cost function
A() is the activation function
d is the desired output value (the fit target)
x_ij is the current input value for a node
Implement a "fit mode" for the prediction algorithm that keeps all return-value vectors x_00 ... x_nn until back propagation has finished.
The inputs must be cached as a 2D matrix immediately after the previous layer's output is multiplied by the weights, so that every input of every neuron is available for back propagation.
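A minimal sketch of such a fit mode in Python with NumPy, assuming a plain fully connected network; all names here (`Network`, `predict`, `fit_mode`, `cached_inputs`) are illustrative and not part of the project's actual API:

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


class Network:
    """Fully connected net whose forward pass can run in "fit mode",
    caching each layer's per-neuron inputs x_ij as a 2D matrix
    (rows = neurons, columns = inputs from the previous layer)
    for later use during back propagation.
    """

    def __init__(self, sizes):
        # One weight matrix per layer pair (layer_{i-1} -> layer_i).
        self.weights = [np.random.randn(m, n) * 0.1
                        for n, m in zip(sizes[:-1], sizes[1:])]
        self.cached_inputs = []  # filled only in fit mode

    def predict(self, a, fit_mode=False):
        if fit_mode:
            self.cached_inputs = []
        for w in self.weights:
            # x_ij: the j-th weighted input of neuron i, kept as a
            # 2D matrix immediately after multiplying by the weights,
            # before summation and activation.
            x = w * a
            if fit_mode:
                self.cached_inputs.append(x)
            a = sigmoid(x.sum(axis=1))
        return a
```

After `predict(sample, fit_mode=True)`, `cached_inputs` holds one matrix per layer, which is exactly what the correction rule `w_ij -= C(A(x_ij), d) * x_ij` needs; a plain `predict(sample)` skips the caching and costs no extra memory.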