Chapter 10: Introduction to Artificial Neural Networks with Keras
ANNs
- Artificial Neural Networks are inspired by the biological neurons in our brains
Perceptrons
Based on the TLU (Threshold Logic Unit)
Composed of a single layer of TLUs, with each TLU connected to all the inputs
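A minimal sketch of a Perceptron in scikit-learn (the iris dataset and the two petal features are just illustrative choices):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron

iris = load_iris()
X = iris.data[:, (2, 3)]             # petal length, petal width (illustrative features)
y = (iris.target == 0).astype(int)   # is it Iris setosa?

per_clf = Perceptron()
per_clf.fit(X, y)

y_pred = per_clf.predict([[2, 0.5]]) # predict the class of a new flower
```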
MLP (Multi Layer Perceptron)
Eliminates some of the limitations of Perceptrons by stacking multiple Perceptrons
composed of (see the Keras sketch after this list):
One passthrough input layer
one or more layers of TLUs (called hidden layers)
One final layer of TLUs (called the output layer)
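A minimal Keras sketch of that structure; the layer sizes and the 28x28-image / 10-class shapes are assumptions:

```python
from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),    # passthrough input layer
    keras.layers.Dense(300, activation="relu"),    # hidden layer of units
    keras.layers.Dense(10, activation="softmax"),  # output layer
])
model.summary()
```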
DNN (Deep Neural Network)
An ANN containing a deep stack of hidden layers
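Illustrative only: the same Sequential API with a deeper stack of hidden layers gives a DNN (the layer count and sizes here are assumed):

```python
from tensorflow import keras

deep_model = keras.models.Sequential()
deep_model.add(keras.layers.Flatten(input_shape=[28, 28]))
for _ in range(5):                                       # deep stack of hidden layers
    deep_model.add(keras.layers.Dense(100, activation="relu"))
deep_model.add(keras.layers.Dense(10, activation="softmax"))
```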
Backpropagation
Gradient Descent with gradients computed automatically (reverse-mode autodiff) in two passes (forward, backward)
Computes the gradient of the network's error with regard to every single model parameter
Epoch
Each time all observations have been sent through the network
Training neural networks involves multiple epochs
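Sketch of multi-epoch training with the MLP built above; Fashion MNIST is just an illustrative dataset choice, and 30 epochs is arbitrary:

```python
# Each call to fit() below sends all training observations through the
# network `epochs` times (one full pass = one epoch).
(X_train, y_train), (X_test, y_test) = keras.datasets.fashion_mnist.load_data()
X_train = X_train / 255.0                          # scale pixels to [0, 1]

model.compile(loss="sparse_categorical_crossentropy",
              optimizer="sgd",
              metrics=["accuracy"])
history = model.fit(X_train, y_train, epochs=30)   # 30 full passes over the data
```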
"For each training instance, the backpropogation algorithm first makes a prediction (forward-pass) and measures the error, then goes through each layer in reverse to measure error contribution from each connection (reverse pass), and finally tweaks the connection wright to reduce the error (Gradient Descent step)
Read the chapter daily. Write a brief summary and list any questions below