wvinzh opened this issue 6 years ago
Have you solved this problem? Please help me, thanks!
Delete the Softmax layer, because it is only needed in deploy.prototxt, and change bottom: "prob" to bottom: "fc2" in the FocalLoss layer.
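A minimal sketch of that suggestion, reusing the layer names and parameters from the prototxt posted below; it assumes the FocalLoss layer in this repo computes probabilities from the raw fc2 scores itself, which is why the explicit Softmax layer becomes redundant at train time:

```
# Sketch only: FocalLoss wired directly to fc2, with the Softmax layer
# removed from the training prototxt (it can remain in deploy.prototxt).
layer {
  name: "loss_cls"
  type: "FocalLoss"
  bottom: "fc2"      # was "prob"
  bottom: "label"
  top: "loss_cls"
  include { phase: TRAIN }
  loss_weight: 1
  loss_param { ignore_label: -1 normalize: true }
  focal_loss_param { alpha: 0.5 gamma: 2 }
}
```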
Why ignore_label: -1?
@wvinzh @lyj0823 @Jacky3213 I also have this problem. I tried changing the num_output of the fc layer from 2 to 1. It does run, but the accuracy is very low... just 60%... So how did you solve it? Please help, thanks!
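For reference, a minimal sketch of that num_output: 1 variant; it assumes the FocalLoss layer here is sigmoid-based (as the SIGMOID_CROSS_ENTROPY_LOSS check quoted below suggests), so it expects exactly one score per label:

```
# Sketch only: single-output (sigmoid-style) setup. With num_output: 1,
# fc2 yields one score per sample, so its count matches the label blob
# (e.g. a batch of 8 gives 8 vs. 8, instead of the 16 vs. 8 in the error below).
layer {
  bottom: "pool5"
  top: "fc2"
  name: "fc2"
  type: "InnerProduct"
  inner_product_param { num_output: 1 }
}
layer {
  name: "loss_cls"
  type: "FocalLoss"
  bottom: "fc2"
  bottom: "label"
  top: "loss_cls"
  focal_loss_param { alpha: 0.5 gamma: 2 }
}
```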
I'm using Focal Loss to train a two-class model. My prototxt is like this:

```
layer {
  bottom: "pool5"
  top: "fc2"
  name: "fc2"
  type: "InnerProduct"
  inner_product_param { num_output: 2 }
}
layer {
  bottom: "fc2"
  top: "prob"
  name: "prob"
  type: "Softmax"
}
layer {
  name: "loss_cls"
  type: "FocalLoss"
  bottom: "prob"
  bottom: "label"
  propagate_down: 1
  propagate_down: 0
  top: "loss_cls"
  include { phase: TRAIN }
  loss_weight: 1
  loss_param {
    ignore_label: -1
    normalize: true
  }
  focal_loss_param {
    alpha: 0.5
    gamma: 2
  }
}
```
But it went wrong:

```
bottom[0]->count() == bottom[1]->count() (16 vs. 8) SIGMOID_CROSS_ENTROPY_LOSS layer inputs must have the same count.
```
I think maybe this Focal Loss is using sigmoid, so how can I use it with softmax?
thanks!