IDoCodingStuffs opened 2 months ago
Currently, per-class weights are simply multiplied into the raw NLL loss, which inflates the loss (and hence gradient) magnitude by the weights' total and skews the effective learning rate. By default, the weights should be normalized by their sum to avoid this:
https://discuss.pytorch.org/t/the-value-of-weights-for-weighted-cross-entropy-should-be-normalized/190727
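A minimal numpy sketch of the arithmetic behind the issue (the weight values, per-sample NLL values, and target indices are made-up placeholders): multiplying raw weights straight into the per-sample loss scales the mean by the weights' magnitude, while sum-normalized weights keep the loss on its original scale.

```python
import numpy as np

# Hypothetical per-class weights, e.g. inverse class frequencies (assumption).
raw_weights = np.array([1.0, 5.0, 2.0])

# Normalizing by the sum turns the weights into a convex combination.
norm_weights = raw_weights / raw_weights.sum()  # now sums to 1

# Stand-in per-sample NLL values and their target classes (assumption).
nll = np.array([0.7, 1.2, 0.4, 2.0])
targets = np.array([0, 1, 2, 1])

# Current behavior described in the issue: raw weights multiplied in,
# so the mean loss (and the gradients) scale with sum(raw_weights).
loss_raw = (raw_weights[targets] * nll).mean()

# Sum-normalized weights preserve the original loss scale.
loss_norm = (norm_weights[targets] * nll).mean()

# loss_raw == raw_weights.sum() * loss_norm, i.e. 8x larger here.
```

The relative emphasis between classes is identical in both cases; only the overall scale, and therefore the effective learning rate, differs.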