Closed mls1999725 closed 3 years ago

Hi, I'm very interested in your work. I wonder what the negative loss in Figure 3 of the MagFace paper is. Is it the value of the cross-entropy loss after softmax?

We appreciate your interest in MagFace. The term "negative loss" is indeed ambiguous and we will clarify it in the next version. It is the negative value of the cross-entropy loss, i.e., the log(softmax) value of the target class. Easy samples have large softmax values, so their log(softmax) values are large as well.

I get it, thanks a lot!
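To make the relation concrete, here is a minimal sketch of the idea: the negative cross-entropy loss for one sample equals the log-softmax value of its target class, so an easy sample (whose target logit dominates) has a value close to 0, while a hard sample has a much more negative value. The example logits below are illustrative, not from the paper.

```python
import numpy as np

def negative_loss(logits, target):
    """Negative cross-entropy = log(softmax) of the target class."""
    # Numerically stable log-softmax: subtract the max logit first.
    shifted = logits - np.max(logits)
    log_probs = shifted - np.log(np.sum(np.exp(shifted)))
    return log_probs[target]

# An "easy" sample: the target logit dominates -> softmax near 1, log near 0.
easy = negative_loss(np.array([8.0, 1.0, 0.5]), target=0)
# A "hard" sample: logits are close -> small target softmax, very negative log.
hard = negative_loss(np.array([1.2, 1.0, 0.9]), target=0)
assert hard < easy < 0.0
```

Because log(softmax) is always at most 0, the negative loss is bounded above by 0 and grows toward it as the sample gets easier.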