unsky / focal-loss

Focal Loss for Dense Object Detection

loss_ = -1 * np.power(1 - pro_, self._gamma) * np.log(pro_) is wrong #14

Open · Sunting78 opened this issue 6 years ago

Sunting78 commented 6 years ago

`loss_ = -1 * np.power(1 - pro_, self._gamma) * np.log(pro_)` is wrong. The `pro_` should be `pt`, right?

unsky commented 6 years ago

I do not understand what you mean. Do you mean the code in the metric? The loss metric is just a display and does not affect the model.

If your loss is in metric.py, either `pro_` or `pt` will be fine. `pro_` is used to measure the loss over all examples, while `pt` is used to measure the loss over the positive examples only.
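
A minimal sketch of the distinction being made here, assuming `pred` is an `(N, C)` array of softmax probabilities and `label` holds integer class ids; the function names and shapes are illustrative, not the repo's exact code:

```python
import numpy as np

def focal_metric_total(pred, label, gamma=2.0, eps=1e-12):
    """Total-example view: apply the focal term to every class
    probability (pro_), summed over the whole prediction array."""
    pro_ = np.clip(pred, eps, 1.0)
    loss_ = -1 * np.power(1 - pro_, gamma) * np.log(pro_)
    return loss_.sum() / label.size

def focal_metric_positive(pred, label, gamma=2.0, eps=1e-12):
    """Positive-example view: apply the focal term only to the
    probability of the true class (pt) of each example."""
    pt = np.clip(pred[np.arange(label.size), label], eps, 1.0)
    loss = -1 * np.power(1 - pt, gamma) * np.log(pt)
    return loss.sum() / label.size
```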

Sunting78 commented 6 years ago

Yes, I know. In focal_loss.py, `loss_ = -1 * np.power(1 - pro_, self._gamma) * np.log(pro_)`. I know the loss metric is just a display and does not affect the model. But I think the softmax loss is `-log(pt)`, where `pt` is the softmax probability of the positive (true) class.

unsky commented 6 years ago

No. The standard softmax with cross-entropy (CE) loss measures the loss over all examples.

Sunting78 commented 6 years ago

I think the softmax loss is `-sum(y_i * log(p_i))`. But the label is one-hot, so the softmax loss reduces to `-log(pt)`.
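
A quick numeric check of that identity, with made-up values:

```python
import numpy as np

p = np.array([0.1, 0.7, 0.2])   # softmax output for one example
y = np.array([0.0, 1.0, 0.0])   # one-hot label, true class t = 1

full = -np.sum(y * np.log(p))   # -sum(y_i * log(p_i))
pt_only = -np.log(p[1])         # -log(pt)
print(np.isclose(full, pt_only))  # True: y zeroes every term except p_t
```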

unsky commented 6 years ago

Yes. `-1 * y * log(pro_) = -log(pt)`, so the standard softmax with CE loss measures the positive-example loss. But in general, I like to display both the total loss and the positive loss; these losses can help us adjust the parameter settings. You are right for one-hot labels.
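
For example, the two metrics could be displayed side by side during training, reusing the illustrative sketch functions from above (values are made up):

```python
import numpy as np

pred = np.array([[0.1, 0.7, 0.2],
                 [0.6, 0.3, 0.1]])
label = np.array([1, 0])

print("total loss:   ", focal_metric_total(pred, label))     # over all class probabilities
print("positive loss:", focal_metric_positive(pred, label))  # over true-class probabilities only
```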