Closed: Bonsen closed this issue 6 years ago.
I have used the lovasz_hinge loss in a binary segmentation task in Keras. The activation of the last layer was sigmoid (with the Adam optimizer). I didn't remove it; I kept it in place and switched to the new loss function. Overall it improved the Jaccard index on my own test set by 1 percentage point (from 75% to 76%).
@Bonsen lovasz_softmax expects softmax-normalized inputs; lovasz_hinge expects unnormalized scores corresponding to the positive class.
The documentation, on the contrary, says lovasz_softmax takes the sigmoid output ("interpreted as binary (sigmoid) output").
Should I remove the sigmoid/softmax activation from the last layer when training?
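For what it's worth, a minimal numpy sketch of the binary Lovász hinge (function names and structure assumed to follow the reference implementation; this is an illustration, not the repo's code) shows why the scores should be unnormalized: the hinge errors `1 - logits * signs` are computed directly on the raw scores, so a preceding sigmoid would squash them into [0, 1] and shrink the margins the loss relies on.

```python
import numpy as np

def lovasz_grad(gt_sorted):
    """Gradient of the Lovasz extension w.r.t. sorted hinge errors.

    gt_sorted: ground-truth labels (0/1) sorted by decreasing error.
    """
    p = len(gt_sorted)
    gts = gt_sorted.sum()
    intersection = gts - np.cumsum(gt_sorted)
    union = gts + np.cumsum(1.0 - gt_sorted)
    jaccard = 1.0 - intersection / union
    if p > 1:
        # discrete derivative of the Jaccard curve
        jaccard[1:] = jaccard[1:] - jaccard[:-1]
    return jaccard

def lovasz_hinge_flat(logits, labels):
    """Binary Lovasz hinge on UNNORMALIZED scores.

    logits: raw (pre-sigmoid) scores, shape (P,)
    labels: binary ground truth in {0, 1}, shape (P,)
    """
    signs = 2.0 * labels - 1.0              # map {0,1} -> {-1,+1}
    errors = 1.0 - logits * signs           # hinge errors on raw scores
    order = np.argsort(-errors)             # sort errors in decreasing order
    errors_sorted = errors[order]
    grad = lovasz_grad(labels[order])
    return float(np.dot(np.maximum(errors_sorted, 0.0), grad))

labels = np.array([1.0, 0.0, 1.0, 0.0])
confident_logits = np.array([10.0, -10.0, 10.0, -10.0])   # correct, large margin
wrong_logits = np.array([-10.0, 10.0, -10.0, 10.0])       # confidently wrong

print(lovasz_hinge_flat(confident_logits, labels))  # 0.0: all margins > 1
print(lovasz_hinge_flat(wrong_logits, labels))      # large positive loss
```

Note that with a sigmoid in front, the "logits" would be confined to (0, 1), so even a perfect prediction could never reach the margin of 1 and the loss would stay artificially high; that is why the answer above says to feed lovasz_hinge raw, unnormalized scores.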