Closed · KevinTang29 closed this issue 3 years ago
@KevinTang29 Focal loss may be a way to address the imbalance between the background class and the other classes during training. Please give it a try: use "SigmoidFocalLoss" instead of "CrossEntropy" for multiple classes in LI_Fusion_with_attention_use_ce_loss.yaml. Good luck!
@happinesslz Thank you very much for your advice. It seems that the original "SigmoidFocalLoss" only supports binary classification, so I changed it to a multi-class focal loss and retrained with it. However, the problem still exists; perhaps the alpha and gamma parameters need to be tuned for the multi-class setting. In addition, I increased the cross-entropy weights of the object classes https://github.com/happinesslz/EPNet/blob/0123c341243846aa3b412addcb9e2c07fd305237/tools/cfgs/LI_Fusion_with_attention_use_ce_loss.yaml#L130 to [1.0, 2.0, 2.0, 2.0], decreased the RoI number in training https://github.com/happinesslz/EPNet/blob/0123c341243846aa3b412addcb9e2c07fd305237/tools/cfgs/LI_Fusion_with_attention_use_ce_loss.yaml#L136 to 32, and trained with CE loss again. The result is better, though still not right: for example, for an RoI that is actually of class 1, the classification scores changed from [0.9, 0.08, 0.01, 0.01] to [0.7, 0.25, 0.03, 0.02]. Is this another possible way to solve the problem?
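For reference, the multi-class variant described above can be sketched as a softmax focal loss. This is a minimal illustration, not the code from EPNet; the function name and default alpha/gamma are assumptions and would need tuning as discussed:

```python
import torch
import torch.nn.functional as F

def multiclass_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Softmax focal loss for multi-class classification.

    logits:  (N, C) raw class scores
    targets: (N,)   integer class labels in [0, C)
    """
    log_probs = F.log_softmax(logits, dim=1)                      # (N, C)
    probs = log_probs.exp()
    # probability and log-probability of the true class per sample
    pt = probs.gather(1, targets.unsqueeze(1)).squeeze(1)         # (N,)
    log_pt = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    # (1 - pt)^gamma down-weights easy examples; alpha rebalances classes
    loss = -alpha * (1.0 - pt) ** gamma * log_pt
    return loss.mean()
```

As noted above, a single scalar alpha may be too coarse for several foreground classes; a per-class alpha vector is a common extension.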
Did you make it work for multiclass-Epnet?
I just trained the model with CE loss using class weights. I set the model to detect four classes ['Car', 'Pedestrian', 'Cyclist', 'Truck'] and set the corresponding weights to [1.0, 3.0, 6.0, 6.0, 12.0] (the first weight is for the background class). The evaluation results seemed acceptable, though not good enough. The weights and other parameters may need further tuning.
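The weighted CE setup described above can be sketched as follows. The class order and weight values are taken from the comment; the tensor names are illustrative, not EPNet's:

```python
import torch
import torch.nn as nn

# Assumed class order: 0 = background, then Car, Pedestrian, Cyclist, Truck.
# Weight values follow the comment above; they likely need tuning per dataset.
class_weights = torch.tensor([1.0, 3.0, 6.0, 6.0, 12.0])
criterion = nn.CrossEntropyLoss(weight=class_weights)

rcnn_cls = torch.randn(8, 5)           # (num_rois, num_classes) logits
cls_label = torch.randint(0, 5, (8,))  # per-RoI class labels in [0, 5)
loss = criterion(rcnn_cls, cls_label)
```

With `weight`, each sample's loss is scaled by the weight of its true class, so rare foreground classes (e.g. Truck at 12.0) contribute more gradient than the abundant background class.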
I have finished it. Could you share your email address with me so we can discuss further?
I would like to know how you did it. My email is kevintang29@163.com
Hello @Benedict0819, I also want to achieve multi-class detection on EPNet. Could you please share your multi-class code with me? My email address is 595603009@qq.com. Thank you very much!
Hello @happinesslz, thanks for the great code. I would like to detect multiple classes using EPNet. I applied the modifications from #3 except the last one. For the last one, I changed https://github.com/happinesslz/EPNet/blob/0123c341243846aa3b412addcb9e2c07fd305237/lib/net/train_functions.py#L236 to
loss_utils.get_reg_loss(torch.max(F.softmax(rcnn_cls_reshape, 1), 1)[0][fg_mask], mask_score[fg_mask],
In addition, after this line https://github.com/happinesslz/EPNet/blob/0123c341243846aa3b412addcb9e2c07fd305237/lib/rpn/proposal_target_layer.py#L69 I added `batch_cls_label[batch_cls_label == 1] = batch_gt_label_of_rois[batch_cls_label == 1]`
to make the labels of the proposals the class labels of their corresponding ground-truth boxes (1~n), or background (0), instead of 0/1. (I also made several other changes: reading the labels of the gt boxes during data loading and storing them in batch_gt_label_of_rois, plus several changes to the postprocessing after model inference.) After these changes the training process seems to work. However, when I tested the trained model, I found that it tends to predict every proposal as the background class. I then changed the predicted class to the non-background class with the maximum softmax value, provided that value is greater than a certain threshold (say 0.05), and the result seems OK. So I guess the problem is caused by the imbalance between the background class and the other classes during training. I tried increasing the number https://github.com/happinesslz/EPNet/blob/0123c341243846aa3b412addcb9e2c07fd305237/tools/cfgs/LI_Fusion_with_attention_use_ce_loss.yaml#L135 to 0.75 and training again, but it seemed to have no effect. Could you please give me some advice on this? Thank you!
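The postprocessing workaround described above (best non-background class above a threshold) can be sketched like this. The function name and tensor shapes are illustrative assumptions, not EPNet's actual postprocessing code:

```python
import torch
import torch.nn.functional as F

def predict_foreground_class(rcnn_cls_logits, score_thresh=0.05):
    """Pick the best non-background class per RoI (class 0 = background).

    Returns predicted labels (0 if no foreground class clears the
    threshold) and the corresponding foreground scores.
    """
    probs = F.softmax(rcnn_cls_logits, dim=1)   # (N, C)
    fg_probs = probs[:, 1:]                     # drop the background column
    fg_scores, fg_idx = fg_probs.max(dim=1)
    labels = fg_idx + 1                         # shift indices back to 1..C-1
    labels[fg_scores < score_thresh] = 0        # too uncertain: keep background
    return labels, fg_scores
```

This masks the symptom rather than fixing the training imbalance, which is consistent with the diagnosis above that the background class dominates the RoI batches.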