Open thnkim opened 5 years ago
Hello, at the code below: https://github.com/TreB1eN/InsightFace_Pytorch/blob/b0e21f6345749130bc68e2a38ca9d92e4f7e9f4e/Learner.py#L203 , I believe the loss computed by nn.CrossEntropyLoss() is already averaged over the batch, so the division by conf.batch_size should not be necessary, right? Thank you.
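For context, a minimal standalone check (not code from the repo) of this claim: `nn.CrossEntropyLoss` uses `reduction='mean'` by default, so the returned loss is already averaged over the batch, and dividing it by the batch size again would shrink it by an extra factor of `batch_size`:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
batch_size, num_classes = 4, 10
logits = torch.randn(batch_size, num_classes)
labels = torch.randint(0, num_classes, (batch_size,))

# Default reduction='mean': per-sample losses are averaged over the batch.
mean_loss = nn.CrossEntropyLoss()(logits, labels)
# reduction='sum': per-sample losses are summed.
summed_loss = nn.CrossEntropyLoss(reduction='sum')(logits, labels)

# The mean-reduced loss already equals sum / batch_size,
# so a further division by batch_size would double-count it.
assert torch.allclose(mean_loss, summed_loss / batch_size)
```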
Yes, you are right. Although this only affects the logged value, I have updated it.