zimenglan-sysu-512 / Focal-Loss

loss layer of implementation

How to define focal loss in train_val.prototxt, two classes with softmax? #12

Open wvinzh opened 6 years ago

wvinzh commented 6 years ago

I'm using Focal Loss to train a two-class model. My prototxt is like this:

```prototxt
layer {
  bottom: "pool5"
  top: "fc2"
  name: "fc2"
  type: "InnerProduct"
  inner_product_param {
    num_output: 2
  }
}
layer {
  bottom: "fc2"
  top: "prob"
  name: "prob"
  type: "Softmax"
}
```

```prototxt
layer {
  name: "loss_cls"
  type: "FocalLoss"
  bottom: "prob"
  bottom: "label"
  propagate_down: 1
  propagate_down: 0
  top: "loss_cls"
  include {
    phase: TRAIN
  }
  loss_weight: 1
  loss_param {
    ignore_label: -1
    normalize: true
  }
  focal_loss_param {
    alpha: 0.5
    gamma: 2
  }
}
```

But it fails with:

```
bottom[0]->count() == bottom[1]->count() (16 vs. 8) SIGMOID_CROSS_ENTROPY_LOSS layer inputs must have the same count.
```

I think this focal loss implementation uses sigmoid, so how can I use it with softmax?

thanks!
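For reference, the softmax form of focal loss that the question is asking about, FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t) with p_t the softmax probability of the true class, can be sketched in NumPy. This is an illustration of the formula only, not this repository's Caffe implementation; the function name and `ignore_label` handling below are assumptions modeled on the prototxt above:

```python
import numpy as np

def softmax(logits):
    # Numerically stable row-wise softmax.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def softmax_focal_loss(logits, labels, alpha=0.5, gamma=2.0, ignore_label=-1):
    """Mean softmax focal loss: -alpha * (1 - p_t)^gamma * log(p_t).

    logits: (N, C) raw class scores (e.g. the fc2 output).
    labels: (N,) integer class labels; entries equal to ignore_label
            are excluded, mirroring loss_param { ignore_label: -1 }.
    """
    probs = softmax(logits)
    keep = labels != ignore_label
    # Probability assigned to the true class for each kept sample.
    p_t = probs[np.flatnonzero(keep), labels[keep]]
    loss = -alpha * (1.0 - p_t) ** gamma * np.log(p_t)
    return loss.mean()
```

With `gamma = 0` and `alpha = 1` this reduces to ordinary softmax cross-entropy, which is a quick sanity check for the formula.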

lyj0823 commented 6 years ago

Have you solved this problem? Please help me, thanks!

JackyWang-001 commented 5 years ago

Delete the Softmax layer, because it is only needed in deploy.prototxt, and change bottom: "prob" to bottom: "fc2" in the FocalLoss layer.
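Concretely, that fix would make the TRAIN part of the prototxt look roughly like this (a sketch assembled from the layers in the original post, assuming the FocalLoss layer computes its own probabilities from the raw fc2 scores):

```prototxt
layer {
  bottom: "pool5"
  top: "fc2"
  name: "fc2"
  type: "InnerProduct"
  inner_product_param {
    num_output: 2
  }
}
# No Softmax layer here; keep it only in deploy.prototxt.
layer {
  name: "loss_cls"
  type: "FocalLoss"
  bottom: "fc2"        # raw scores instead of "prob"
  bottom: "label"
  propagate_down: 1
  propagate_down: 0
  top: "loss_cls"
  include {
    phase: TRAIN
  }
  loss_weight: 1
  loss_param {
    ignore_label: -1
    normalize: true
  }
  focal_loss_param {
    alpha: 0.5
    gamma: 2
  }
}
```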

ngunauj commented 5 years ago

Why ignore_label: -1?

1343464520 commented 4 years ago

@wvinzh @lyj0823 @Jacky3213 I also have this problem. I tried changing the num_output of the fc layer from 2 to 1. That does run, but the accuracy is very low, just 60%. So, how did you solve it? Please help, thanks!