Cappinator opened this issue 5 years ago
Same here. There are typos in the activation functions.
For the first simple perceptron, the step activation should be:
return 1.0 if activation >= 0.0 else 0.0
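For context, here is a minimal sketch of how that corrected step activation fits into a predict function. The weights layout (bias at weights[0], class label as the last element of each row) is my assumption, not necessarily the book's exact code:

def predict(row, weights):
    # weighted sum of the inputs plus the bias term (assumed to be weights[0])
    activation = weights[0]
    for i in range(len(row) - 1):  # assumes the last element of row is the class label
        activation += weights[i + 1] * row[i]
    # corrected step activation: output 1.0 when the weighted sum is non-negative
    return 1.0 if activation >= 0.0 else 0.0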
For ReLU it should be:
return activation if activation > 0.0 else 0.0
Also, the third input is [1.0, 11.0, 1.0], which is either a typo or a deliberate outlier.
I also found this when doing the exercise today. Another way of implementing ReLU:
def ReLU(activation):
    return max(activation, 0.0)
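As a quick sanity check (with arbitrarily chosen inputs), this max-based version should agree with the conditional form above:

for x in (-2.0, 0.0, 3.5):
    assert ReLU(x) == (x if x > 0.0 else 0.0)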
The output is not converging at all, unlike what the book describes, so there must be an error in the code.
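For anyone comparing against their own run: a minimal perceptron training loop, using the predict sketch above, looks roughly like this. The learning rate, epoch count, and data layout are my assumptions, not the book's; with the corrected step activation it should converge on linearly separable data:

def train_weights(train, l_rate=0.1, n_epoch=5):
    # one bias term at index 0 plus one weight per input feature (assumed layout)
    weights = [0.0] * len(train[0])
    for _ in range(n_epoch):
        for row in train:
            error = row[-1] - predict(row, weights)
            # standard perceptron update: adjust bias and weights by the prediction error
            weights[0] += l_rate * error
            for i in range(len(row) - 1):
                weights[i + 1] += l_rate * error * row[i]
    return weights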