Closed — huhuh1234567 closed this issue 2 years ago
Thanks for pointing this out @huhuh1234567! Would you be interested in submitting a PR to fix this?
Description
In the class djl/api/src/main/java/ai/djl/training/loss/SigmoidBinaryCrossEntropyLoss, when fromSigmoid is set to true, the loss is computed incorrectly: at line 77, a log operation is missing after NDArrays.sub(1., pred).add(eps).
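To illustrate the effect, here is a minimal plain-Java sketch (not using DJL itself; method names and the eps value are illustrative assumptions) of binary cross-entropy computed from already-sigmoided predictions. Both terms of the formula need a log; dropping the log on the (1 - pred) term, as described above, can make the result negative.

```java
public class BceSketch {
    // Correct form: -(y * log(p + eps) + (1 - y) * log(1 - p + eps))
    static double bceFromSigmoid(double[] label, double[] pred) {
        final double eps = 1e-12; // numerical-stability epsilon, analogous to DJL's eps
        double sum = 0;
        for (int i = 0; i < pred.length; i++) {
            sum += -(label[i] * Math.log(pred[i] + eps)
                    + (1 - label[i]) * Math.log(1 - pred[i] + eps));
        }
        return sum / pred.length;
    }

    // Buggy form: the second term uses (1 - p + eps) directly, without the log
    static double bceMissingLog(double[] label, double[] pred) {
        final double eps = 1e-12;
        double sum = 0;
        for (int i = 0; i < pred.length; i++) {
            sum += -(label[i] * Math.log(pred[i] + eps)
                    + (1 - label[i]) * (1 - pred[i] + eps));
        }
        return sum / pred.length;
    }

    public static void main(String[] args) {
        double[] label = {0.0, 0.0};
        double[] pred = {0.1, 0.2};
        // The correct loss is positive; the buggy one goes negative,
        // matching the symptom reported in this issue.
        System.out.println("correct loss = " + bceFromSigmoid(label, pred));
        System.out.println("buggy loss   = " + bceMissingLog(label, pred));
    }
}
```

For example, with label 0 and prediction 0.1, the correct loss is -log(0.9) ≈ 0.105, while the buggy version yields -(1 - 0.1) = -0.9.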
Expected Behavior
The loss should be positive: binary cross-entropy is -(y log p + (1 - y) log(1 - p)), and each log term is negative for p in (0, 1).
Error Message
No exception is thrown; the symptom is that the computed loss can be negative.
How to Reproduce?
Steps to reproduce
What have you tried to solve it?
Environment Info