Open · Sunting78 opened this issue 6 years ago
I don't understand what you mean. The code in the metric? The loss metric is just for display and doesn't affect the model.
If your loss is in metric.py, both pro and pt will be fine. pro_ measures the loss over all examples, while pt measures the loss over the positive examples only.
Yes, I know. In focal_loss.py there is `loss = -1 * np.power(1 - pro, self.gamma) * np.log(pro)`. I understand that the loss metric is just for display and doesn't affect the model. But I think the softmax loss is -log(pt), where pt is the softmax probability of the positive (true) class.
No. The standard softmax with cross-entropy loss measures the loss over all examples.
I think softmax loss = -sum(y_i * log(p_i)). But the label is one-hot, so softmax loss = -log(pt).
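A minimal sketch of this point (the arrays here are hypothetical, not from the repo): with a one-hot label, the full cross-entropy sum collapses to the single term for the true class, which is exactly -log(pt).

```python
import numpy as np

p = np.array([0.1, 0.7, 0.2])   # softmax probabilities over 3 classes
y = np.array([0.0, 1.0, 0.0])   # one-hot label; true class is index 1

# full cross-entropy: -sum(y_i * log(p_i))
full_ce = -np.sum(y * np.log(p))

# pt: probability assigned to the true class
pt = p[np.argmax(y)]

# with one-hot y, the two are identical
assert np.isclose(full_ce, -np.log(pt))
```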
Yes. -1 * y * log(pro_) = -log(pt), so the standard softmax with cross-entropy loss measures the positive-class loss. But in general, I like to display both the total loss and the positive loss; these can help us adjust the parameter settings. You are right for one-hot labels.
`loss = -1 * np.power(1 - pro, self.gamma) * np.log(pro)` is wrong. The pro_ should be pt, right?