faustomilletari / VNet

GNU General Public License v3.0

How to change the loss? #28

Closed sagarhukkire closed 7 years ago

sagarhukkire commented 7 years ago

hi @faustomilletari

I see VNet has been tested with SoftmaxWithLoss; if I want to use it instead of the Dice loss for testing, which changes do I have to consider?

Thanks

faustomilletari commented 7 years ago

Just use the Caffe documentation to decide which loss you want to use. You can also write your own in Python, taking the Dice loss file as inspiration.
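To see what such a custom Python loss would need to compute, here is a minimal NumPy sketch of voxel-weighted softmax cross-entropy. This is illustrative only (the function name, shapes, and normalization are assumptions, not taken from the VNet code):

```python
import numpy as np

def weighted_softmax_loss(scores, labels, weights):
    """Voxel-weighted softmax cross-entropy (illustrative sketch).

    scores:  (N, C) raw class scores, one row per voxel
    labels:  (N,)   integer class labels
    weights: (N,)   per-voxel weights
    """
    # Numerically stable softmax over the class axis
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)
    # Negative log-likelihood of the true class for each voxel
    nll = -np.log(probs[np.arange(len(labels)), labels])
    # Weighted average over voxels
    return (weights * nll).sum() / weights.sum()
```

With all weights set to 1 this reduces to a plain averaged softmax cross-entropy, which is what the stock two-bottom SoftmaxWithLoss computes.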

sagarhukkire commented 7 years ago

I just want to try SoftmaxWithLoss. I went through the docs, so if I comment out the Dice loss in the prototxt and write SoftmaxWithLoss instead, will it work, since it is provided by Caffe itself? What is your input?

faustomilletari commented 7 years ago

I changed that layer to use weights, so you may also need to provide weights. Have a look at the 3D Caffe SoftmaxWithLoss layer as well, and check it for bugs.

sagarhukkire commented 7 years ago

hi @faustomilletari

I was just checking the prototxt in Hough-CNN and comparing it with your comment in VNet.py:

 #solver.net.blobs['labelWeight'].data[...] = batchWeight.astype(dtype=np.float32)
            #use only if you do softmax with loss

Does that mean the SoftmaxWithLoss layer in the prototxt will have three bottoms, as you have in Hough-CNN? I do not get this. Or can I continue with two bottoms (pred, label) and one top (loss)?

layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "pred"
  bottom: "label"
  bottom: "weight"
  top: "loss"
}
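For comparison, the stock Caffe layer takes only two bottoms; the weight bottom is the nonstandard addition in this fork. A two-bottom version would look like the snippet above with the weight line dropped:

```
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "pred"
  bottom: "label"
  top: "loss"
}
```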

sagarhukkire commented 7 years ago

@faustomilletari By the way, great job man! You added sample weights, which are not available in stock Caffe. Hope they will add it. After a lot of digging into why there are three bottoms, I found the following blog post:

http://deepdish.io/2014/11/04/caffe-with-weighted-samples/

I guess that's the reason you added a third bottom to SoftmaxWithLoss.
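The idea in that blog post can be sketched numerically: for softmax cross-entropy, the per-sample gradient with respect to the scores is (probs - onehot), so weighting a sample's loss simply scales its gradient contribution by the same factor. A small NumPy check (illustrative only, not the Caffe implementation; function names are made up):

```python
import numpy as np

def softmax(scores):
    """Row-wise, numerically stable softmax."""
    exp = np.exp(scores - scores.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def weighted_loss_grad(scores, labels, weights):
    """Gradient of sum_i w_i * cross_entropy_i w.r.t. the scores.

    For softmax cross-entropy the per-sample gradient is
    (probs - onehot), so each sample's gradient is scaled
    by its weight.
    """
    probs = softmax(scores)
    onehot = np.eye(scores.shape[1])[labels]
    return weights[:, None] * (probs - onehot)
```

Doubling a sample's weight doubles its gradient, and a zero weight removes the sample from training entirely, which is exactly what the extra weight bottom makes possible.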