Title
What is Backpropagation?
URL
https://deepai.org/machine-learning-glossary-and-terms/backpropagation
Summary
Backpropagation, short for "backward propagation of errors," is a core step in training feedforward networks: it computes the gradient of the loss function with respect to each weight so that the individual node weights can be updated.
Key Points
A loss function measures how far the network's predictions are from the true labels.
Backpropagation begins at the last layer and works backward to the first, reusing each layer's intermediate values.
A hidden layer's output is the activation function applied to the product of the hidden weight matrix and the input vector, plus the bias:
ActivationFunction(HiddenWeightMatrix * InputVector + Bias)
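The formula above can be sketched as a short NumPy example: one hidden-layer forward pass followed by a backprop weight update. The names (`sigmoid`, `W`, `b`, the squared-error target) are illustrative assumptions, not taken from the article.

```python
import numpy as np

def sigmoid(z):
    # A common choice of activation function (an assumption here)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(3,))     # input vector
W = rng.normal(size=(4, 3))   # hidden weight matrix
b = np.zeros(4)               # bias

# Forward pass: ActivationFunction(HiddenWeightMatrix * InputVector + Bias)
h = sigmoid(W @ x + b)

# Backward pass: given the loss gradient w.r.t. the layer output (dL/dh),
# the chain rule carries it through the activation down to W and b.
dL_dh = h - np.ones(4)        # e.g. squared-error loss against a target of ones
dL_dz = dL_dh * h * (1 - h)   # sigmoid'(z) expressed as h * (1 - h)
dL_dW = np.outer(dL_dz, x)    # gradient for the weight matrix
dL_db = dL_dz                 # gradient for the bias

# Gradient-descent update of the node weights
lr = 0.1
W -= lr * dL_dW
b -= lr * dL_db
```

In a deeper network the same step repeats layer by layer: each layer's `dL_dz` is propagated backward (via `W.T @ dL_dz`) to become the incoming gradient for the layer before it, which is why the algorithm runs from the last layer to the first.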
Citation
Repo link