bermanmaxim / LovaszSoftmax

Code for the Lovász-Softmax loss (CVPR 2018)
http://bmax.im/LovaszSoftmax
MIT License

lovasz_hinge as loss function Error #27

Open AmericaBG opened 5 years ago

AmericaBG commented 5 years ago

Hi!! Firstly, thank you very much for sharing your code. It's great work!

I'm trying to train my model with lovasz_hinge as loss function:

model.compile(optimizer =opt,loss= [lovasz_hinge], metrics = [matthews_correlation])

But I get the following error:

File "C:\Users\Usuario\Anaconda3\envs\env_gpu\lib\site-packages\keras\optimizers.py", line 91, in get_gradients raise ValueError('An operation has None for gradient. '

ValueError: An operation has None for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval

Do you know what the problem is?

Thank you very much in advance!

Frost-Lee commented 4 years ago

According to this comment, simply swapping the positions of the labels and logits arguments in the function signature will fix it.

def lovasz_hinge(labels, logits, per_image=True, ignore=None):
    # Only the argument order in the signature changes;
    # the body of the function stays the same.
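The reason the swap works: Keras invokes a loss as `loss(y_true, y_pred)`, while the repository's `lovasz_hinge` is written as `lovasz_hinge(logits, labels)`, so with the original order Keras ends up differentiating with respect to the labels, which have no gradient. Instead of editing the function itself, a small adapter can reverse the first two arguments; `swap_first_two` below is a hypothetical helper sketch, not part of this repository:

```python
def swap_first_two(loss_fn):
    """Adapt a loss written as loss(logits, labels) to the
    loss(y_true, y_pred) calling convention that Keras uses."""
    def wrapped(y_true, y_pred, **kwargs):
        # Keras passes (y_true, y_pred); the wrapped loss expects
        # (logits, labels), i.e. (y_pred, y_true), so forward them swapped.
        return loss_fn(y_pred, y_true, **kwargs)
    return wrapped
```

With such an adapter, the compile call from the question could then stay unchanged apart from wrapping the loss, e.g. `model.compile(optimizer=opt, loss=swap_first_two(lovasz_hinge), metrics=[matthews_correlation])` (sketch, assuming the original `lovasz_hinge` signature).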