I've added a small section on activation functions. It covers the idea of a neuron being active or inactive, two sigmoidal functions (sigmoid and tanh), and two examples of ReLU functions. It also mentions neural network saturation and how it can slow learning.
I also updated two placeholder pictures.
All figures are self-made, so there are no copyright problems.
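For reference, a minimal sketch of the functions the new section describes (these are my own illustrative definitions, not code from the article): the two sigmoidal functions, two ReLU variants (plain ReLU and leaky ReLU as an assumed second example), and the sigmoid's derivative, which shows why saturation slows learning.

```python
import math

def sigmoid(x):
    """Logistic sigmoid: squashes input to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Hyperbolic tangent: squashes input to (-1, 1)."""
    return math.tanh(x)

def relu(x):
    """Rectified linear unit: 0 when inactive, linear when active."""
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    """ReLU variant (assumed example): small slope for negative inputs."""
    return x if x > 0 else alpha * x

def sigmoid_grad(x):
    """Derivative of the sigmoid: s(x) * (1 - s(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Saturation: far from zero the sigmoid flattens out, its gradient
# shrinks toward zero, weight updates become tiny, and learning slows.
print(sigmoid_grad(0.0))   # maximum gradient, 0.25
print(sigmoid_grad(10.0))  # nearly zero: the neuron is saturated
```

The gradient check at the end is the point of the saturation discussion: a saturated sigmoid or tanh unit passes almost no gradient back, which is one motivation for preferring ReLU-style activations.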