CoinCheung / pytorch-loss

label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful
MIT License
2.17k stars · 374 forks

should use log_probs in affinity loss #41

Open eliqbiq opened 1 month ago

eliqbiq commented 1 month ago

In `AffinityFieldLoss`, the KL divergence calculation should use `log_probs` instead of `probs`: PyTorch's `F.kl_div` expects its first argument to be log-probabilities, so passing raw probabilities computes the wrong quantity.
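A minimal sketch of the point being reported, assuming the loss builds per-pixel KL terms between a pixel's distribution and a neighbouring pixel's distribution (the neighbour construction here is a hypothetical stand-in, not the repo's actual shifting logic):

```python
import torch
import torch.nn.functional as F

# F.kl_div(input, target) computes target * (log(target) - input) pointwise,
# i.e. the first argument must already be in log-space.
torch.manual_seed(0)
logits = torch.randn(2, 3, 4, 4)           # N, C, H, W
shifted = torch.roll(logits, 1, dims=2)    # hypothetical neighbour map

probs = logits.softmax(dim=1)
log_probs = logits.log_softmax(dim=1)
neigh_probs = shifted.softmax(dim=1)

# wrong: first argument given as plain probabilities
kl_wrong = F.kl_div(probs, neigh_probs, reduction='none').sum(dim=1)
# right: first argument given as log-probabilities
kl_right = F.kl_div(log_probs, neigh_probs, reduction='none').sum(dim=1)

# manual reference: KL(p_neigh || p) = sum p_neigh * (log p_neigh - log p)
kl_ref = (neigh_probs * (neigh_probs.log() - log_probs)).sum(dim=1)

print(torch.allclose(kl_right, kl_ref))    # True
print(torch.allclose(kl_wrong, kl_ref))    # False
```

Only the `log_probs` version matches the KL divergence written out by hand; the `probs` version silently computes something else, since `F.kl_div` never applies a log to its input.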

cemamxiaoxixi commented 1 month ago

I have received your email and will reply as soon as possible.