iamtrask / Grokking-Deep-Learning

this repository accompanies the book "Grokking Deep Learning"

Activate layer 2 output using Relu()? #15

Open michaelmegliola opened 5 years ago

michaelmegliola commented 5 years ago

I might be reading it incorrectly, but it looks like the activation function isn't applied to the final output layer (layer_2). Should it be applied in this context?
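A minimal numpy sketch of the forward pass in question (weight shapes and the names weights_0_1 / weights_1_2 are assumptions for illustration; layer_1, layer_2, and relu follow the book's naming):

```python
import numpy as np

def relu(x):
    # ReLU activation: zero out negative values
    return np.maximum(0, x)

np.random.seed(1)
weights_0_1 = 0.2 * np.random.random((3, 4)) - 0.1  # input -> hidden (assumed shapes)
weights_1_2 = 0.2 * np.random.random((4, 1)) - 0.1  # hidden -> output

inputs = np.array([[1.0, 0.5, -0.3]])

layer_1 = relu(inputs.dot(weights_0_1))  # hidden layer IS passed through relu()
layer_2 = layer_1.dot(weights_1_2)       # output layer: raw weighted sum, no relu()
```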

naruto678 commented 5 years ago

You do not need to do that in this context. If you are predicting probabilities, you can apply a softmax to layer_2, as trask shows in the later chapters.
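If the goal is class probabilities, a row-wise softmax can be applied to layer_2. This is a minimal sketch of that idea, not the book's exact code; the example logits are made up:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    exp = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return exp / np.sum(exp, axis=-1, keepdims=True)

layer_2 = np.array([[2.0, 1.0, 0.1]])  # raw output scores for 3 classes (illustrative)
probs = softmax(layer_2)               # each row now sums to 1
print(probs)                           # approx [[0.659 0.242 0.099]]
```

For a plain regression output (a single real-valued prediction), leaving layer_2 as the raw weighted sum is the usual choice, which is why no activation is applied there.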