Hi, I read the source code for training and came across the cross-entropy loss implementation. Here's the code snippet:
```python
elif self.costFunction == "cross_entropy":
    epsilon = 1e-12  # prevent zero argument in logarithm or division
    error = -(y * ncp.log(z + epsilon) + (1 - y) * ncp.log(1 - z + epsilon))
```
I find it fascinating that you're using binary cross-entropy loss for a multi-class classification problem. I'm curious if there's any particular reason or insight behind using it instead of the usual categorical cross-entropy loss. Thank you!
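For reference, here is a minimal sketch of the distinction being asked about, using plain NumPy (`ncp` in the snippet is presumably a NumPy-compatible module; the function names and example arrays below are my own, not from the repository). The formula in the snippet applies the *binary* form element-wise to every output unit, whereas the categorical form only scores the log-probability assigned to the true class:

```python
import numpy as np

epsilon = 1e-12  # same stabilizer as in the snippet

def binary_cross_entropy(y, z):
    # Element-wise BCE as in the snippet: each output unit is
    # treated as an independent binary classifier.
    return -np.mean(y * np.log(z + epsilon) + (1 - y) * np.log(1 - z + epsilon))

def categorical_cross_entropy(y, z):
    # Standard CCE: only the log-probability of the true class
    # contributes; assumes each row of z sums to 1 (softmax output).
    return -np.mean(np.sum(y * np.log(z + epsilon), axis=1))

# One-hot targets and softmax-like predictions for 3 classes
y = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
z = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])

print("BCE:", binary_cross_entropy(y, z))
print("CCE:", categorical_cross_entropy(y, z))
```

The two losses generally differ in value and gradient: BCE also penalizes confidence placed on the wrong classes through the `(1 - y) * log(1 - z)` term, while CCE relies on softmax normalization to do that implicitly.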