Closed dwazalwar closed 6 years ago
In Intro to Neural Networks, Cross-Entropy 1, towards the end of the lesson (check at 2:22): for the right model, the term should be -log(0.7) instead of -log(0.2).
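To see why the reported value matters: in that lesson, each term in the cross-entropy is the negative log of the probability the model assigns to the *correct* label, so a model that assigns 0.7 to the right answer should contribute -log(0.7), not -log(0.2). A minimal sketch (the 0.7/0.2 values are just the ones from the report; natural log assumed, as in the lesson):

```python
import math

def cross_entropy(correct_label_probs):
    # Sum of -ln(p) over the probabilities the model
    # assigns to each point's correct label.
    return -sum(math.log(p) for p in correct_label_probs)

# Per-term values from the report:
print(-math.log(0.7))  # small penalty: the model is fairly confident and right
print(-math.log(0.2))  # larger penalty: this is the term the video shows by mistake
```

The smaller -log(0.7) term is exactly what makes the "right" model's total cross-entropy lower, which is the point the lesson is demonstrating.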
A correction note was provided a while ago. Thanks for reporting!