Open maryamag85 opened 5 years ago
The definition of cross-entropy loss is confusing. Referring to the Stanford lecture notes (http://cs231n.github.io/linear-classify/): you are treating log loss as if it were the same thing as cross-entropy loss.
You can see log loss as a special case (with only 2 classes) of cross-entropy loss.
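For what it's worth, the "special case" claim is easy to check numerically. A minimal sketch (function names are mine, not from the notes): general cross-entropy over a true distribution `p` and a predicted distribution `q` reduces to binary log loss when there are only two classes.

```python
import math

def cross_entropy(p, q):
    # General cross-entropy H(p, q) = -sum_i p_i * log(q_i).
    # Terms with p_i == 0 contribute nothing, so they are skipped.
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def log_loss(y, q1):
    # Binary log loss: y is the true label (0 or 1),
    # q1 is the predicted probability of class 1.
    return -(y * math.log(q1) + (1 - y) * math.log(1 - q1))

# With two classes the two formulas coincide:
y, q1 = 1, 0.8
ce = cross_entropy([y, 1 - y], [q1, 1 - q1])
ll = log_loss(y, q1)
print(abs(ce - ll) < 1e-12)  # the two losses agree
```

So log loss is cross-entropy restricted to the two-class case, not a different loss; the docs should say that explicitly rather than using the names interchangeably.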