sandialabs / cross-sim

CrossSim: accuracy simulation of analog in-memory computing

About the cross-entropy loss. #13

Open · aa1234241 opened 1 year ago

aa1234241 commented 1 year ago

Hi, while reading the training source code I came across the cross-entropy loss implementation. Here's the snippet:

    elif self.costFunction == "cross_entropy":
        epsilon = 1e-12  # prevent zero argument in logarithm or division
        error = -(y * ncp.log(z + epsilon) + (1 - y) * ncp.log(1 - z + epsilon))

I find it fascinating that you're applying binary cross-entropy loss to a multi-class classification problem. I'm curious whether there's a particular reason or insight behind using it instead of the usual categorical cross-entropy loss. Thank you!
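
For concreteness, here is a minimal sketch of the difference, assuming ncp resolves to NumPy (y and z follow the snippet above; the numeric values and the names bce/cce are purely illustrative, not from the CrossSim codebase):

    import numpy as ncp  # stand-in, assuming ncp is a NumPy-compatible module

    epsilon = 1e-12  # same guard against log(0) as in the snippet above
    y = ncp.array([0.0, 1.0, 0.0])  # one-hot target for class 1
    z = ncp.array([0.1, 0.7, 0.2])  # per-class network outputs in (0, 1)

    # Binary cross-entropy applied elementwise, as in the snippet above:
    # every output unit contributes, including a (1 - y) * log(1 - z) term
    # that penalizes confidence assigned to the wrong classes.
    bce = -(y * ncp.log(z + epsilon) + (1 - y) * ncp.log(1 - z + epsilon))
    print(bce.sum())  # ~0.685: sums contributions from all three units

    # Categorical cross-entropy: only the true class's log-probability counts.
    cce = -(y * ncp.log(z + epsilon)).sum()
    print(cce)  # ~0.357, i.e. -log(0.7)

The two losses produce different gradients with respect to the wrong-class outputs, which is why I'd expect the choice to matter for training.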