bermanmaxim / LovaszSoftmax

Code for the Lovász-Softmax loss (CVPR 2018)
http://bmax.im/LovaszSoftmax
MIT License

Should I remove the "sigmoid/softmax" layer at the end? #8

Closed Bonsen closed 6 years ago

Bonsen commented 6 years ago

Should I remove the "sigmoid/softmax" layer at the end of the network when training?

SorourMo commented 6 years ago

I have used the lovasz_hinge loss in a binary segmentation task in Keras. The activation function of the last layer was a sigmoid (with the Adam optimizer). I did not remove it; I kept it there and used the new loss function. Overall it improved the Jaccard index on my own test set by about 1% (from 75% to 76%).
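For reference, a minimal sketch of wiring this up as a Keras loss, assuming the repo's TensorFlow implementation (lovasz_losses_tf.py) is importable and that, as in the setup above, the model ends in a sigmoid. Since lovasz_hinge expects unnormalized scores, the sketch inverts the sigmoid to recover logits; the wrapper name keras_lovasz_hinge and the [B, H, W, 1] shapes are assumptions for illustration, not part of the repo.

```python
import tensorflow as tf
import lovasz_losses_tf as L  # the repo's TensorFlow implementation

def keras_lovasz_hinge(y_true, y_pred):
    # Hypothetical Keras-style loss; y_pred are sigmoid probabilities
    # and y_true are binary masks, both assumed to be [B, H, W, 1].
    eps = 1e-7
    probs = tf.clip_by_value(tf.squeeze(y_pred, axis=-1), eps, 1.0 - eps)
    # lovasz_hinge expects unnormalized scores, so invert the sigmoid:
    # logit = log(p / (1 - p)).
    logits = tf.math.log(probs / (1.0 - probs))
    return L.lovasz_hinge(logits, tf.squeeze(y_true, axis=-1), per_image=True)

# model.compile(optimizer='adam', loss=keras_lovasz_hinge)
```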

bermanmaxim commented 6 years ago

@Bonsen lovasz_softmax expects softmax-normalized inputs; lovasz_hinge expects unnormalized scores corresponding to the positive class.
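In PyTorch terms, a minimal sketch of the distinction, assuming the repo's lovasz_losses.py is importable; the toy shapes and random tensors are made up for illustration:

```python
import torch
import torch.nn.functional as F
import lovasz_losses as L  # the repo's PyTorch implementation

# Toy raw network outputs (no final activation inside the model).
B, C, H, W = 2, 3, 8, 8
scores = torch.randn(B, C, H, W)             # multi-class scores
logits = torch.randn(B, H, W)                # binary scores for the positive class
labels_mc = torch.randint(0, C, (B, H, W))   # class-index ground truth
labels_bin = torch.randint(0, 2, (B, H, W))  # {0, 1} ground-truth masks

# lovasz_softmax expects softmax-normalized probabilities:
loss_mc = L.lovasz_softmax(F.softmax(scores, dim=1), labels_mc)

# lovasz_hinge expects unnormalized scores (no sigmoid applied):
loss_bin = L.lovasz_hinge(logits, labels_bin)
```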

rajexp commented 3 years ago

> @Bonsen lovasz_softmax expects softmax-normalized inputs; lovasz_hinge expects unnormalized scores corresponding to the positive class.

The documentation, on the contrary, says lovasz_softmax can also take sigmoid output: a single-channel probability map is "interpreted as binary (sigmoid) output".
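Both readings can coexist: the docstring's remark covers the binary special case, where a [B, H, W] probability map (e.g. from a final sigmoid layer) is accepted directly. A minimal sketch of that case, again assuming the repo's PyTorch lovasz_losses.py and toy shapes chosen for illustration:

```python
import torch
import lovasz_losses as L  # the repo's PyTorch implementation

# Binary case: a single probability map, e.g. from a final sigmoid layer.
probs = torch.sigmoid(torch.randn(4, 16, 16))  # [B, H, W], values in (0, 1)
labels = torch.randint(0, 2, (4, 16, 16))      # {0, 1} ground-truth masks

# Per the docstring, a 3-D input is "interpreted as binary (sigmoid) output":
loss = L.lovasz_softmax(probs, labels)
```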