bermanmaxim / LovaszSoftmax

Code for the Lovász-Softmax loss (CVPR 2018)
http://bmax.im/LovaszSoftmax
MIT License

Should labels be only {-1, 1} in the case of binary segmentation? #10

Closed SorourMo closed 6 years ago

SorourMo commented 6 years ago

Hi there,

Thanks for sharing the code for this fantastic work, and congratulations on your CVPR paper! I have a question about the labels in the ground truths: should they be {-1, 1} (-1: background, 1: foreground)? In other words, {0, 1} (0: background, 1: foreground) wouldn't work properly, right?

bermanmaxim commented 6 years ago

Hi, thanks! The binary version expects 0/1 masks, as noted e.g. in this code comment.
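
To make the convention concrete, here is a minimal sketch of calling the binary loss on 0/1 masks, assuming the PyTorch implementation in pytorch/lovasz_losses.py and a lovasz_hinge(logits, labels) call signature, and assuming the loss maps 0/1 labels to ±1 signs internally:

```python
# Minimal sketch (editor's illustration, not code from this thread), assuming the
# PyTorch implementation in pytorch/lovasz_losses.py and its
# lovasz_hinge(logits, labels) signature.
import torch

from lovasz_losses import lovasz_hinge  # assumed import path, relative to the repo's pytorch/ folder

B, H, W = 2, 64, 64
logits = torch.randn(B, H, W, requires_grad=True)  # raw per-pixel scores, any real value
labels = torch.randint(0, 2, (B, H, W))            # binary ground-truth masks: 0 = background, 1 = foreground

loss = lovasz_hinge(logits, labels)  # 0/1 labels are assumed to be mapped to +/-1 signs inside the loss
loss.backward()
print(loss.item())
```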

SorourMo commented 6 years ago

Perfect. Thanks

Kang9779 commented 6 years ago

Hi there, thanks for sharing the code for this fantastic work. I get the following error when using the lovasz_hinge function; do you know what's wrong?

An operation has None for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.

bermanmaxim commented 6 years ago

@kangzhang0709 what framework are you using?

bermanmaxim commented 6 years ago

@kangzhang0709 Closing this issue; please open another one with more details about your problem (e.g. the framework used).
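
The cause of the "None for gradient" error above is not resolved in this thread, but the Keras message generally means a non-differentiable op (e.g. K.round or K.argmax) sits between the trainable weights and the loss. A minimal sketch of wrapping the TensorFlow lovasz_hinge from tensorflow/lovasz_losses_tf.py as a Keras loss, keeping every op differentiable, might look like the following; the import path, the [B, H, W, 1] model output shape, and the lovasz_hinge(logits, labels) signature are assumptions.

```python
# Minimal sketch (editor's illustration, not a fix confirmed in this thread), assuming
# the TensorFlow implementation in tensorflow/lovasz_losses_tf.py and a
# lovasz_hinge(logits, labels) signature taking [B, H, W] tensors.
import tensorflow as tf

from lovasz_losses_tf import lovasz_hinge  # assumed import path

def lovasz_hinge_keras(y_true, y_pred):
    """Keras-compatible wrapper: y_pred must be raw logits (linear output layer)
    and y_true must be 0/1 masks; no rounding or argmax, so gradients stay defined."""
    logits = tf.squeeze(y_pred, axis=-1)  # [B, H, W, 1] -> [B, H, W]
    labels = tf.squeeze(y_true, axis=-1)
    return lovasz_hinge(logits, labels)

# Hypothetical usage:
# model.compile(optimizer='adam', loss=lovasz_hinge_keras)
```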