microsoft / SoftTeacher

Semi-Supervised Learning, Object Detection, ICCV2021
MIT License

Queries on Weights for Classification Loss on Background Boxes #145

Open anuj-sharma-19 opened 2 years ago

anuj-sharma-19 commented 2 years ago

Hi,

First of all, congratulations on the great work, and many thanks for sharing the code!!

I have been checking out the code, and have a query regarding the weight for classification loss on background proposals.

The classification loss on pseudo labels seems to be done here https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L240

where the bbox_targets are computed a couple of lines earlier here https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L215

which I understand refers to the bbox_head.py in mmdetection here https://github.com/open-mmlab/mmdetection/blob/bde7b4b7eea9dd6ee91a486c6996b2d68662366d/mmdet/models/roi_heads/bbox_heads/bbox_head.py#L183

which further calls _get_target_single() here https://github.com/open-mmlab/mmdetection/blob/bde7b4b7eea9dd6ee91a486c6996b2d68662366d/mmdet/models/roi_heads/bbox_heads/bbox_head.py#L117

But in this, the negative proposals are assigned a weight of 1.0, whereas, as mentioned in the paper, it should be the cls-score obtained by running those proposals through the teacher.

Maybe I am missing something in the code. It would be really great if you could kindly clarify the above, or point me to where the teacher's cls-score enters the classification loss for background proposals.
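For context, here is a minimal sketch of what I understand `_get_target_single()` to do for negatives (simplified and illustrative; the names and shapes are not the exact mmdetection code):

```python
# Simplified view of mmdetection's _get_target_single() (illustrative,
# not the actual code): negatives are labeled as background and given
# a fixed label weight of 1.0.
def get_target_single_sketch(num_pos, num_neg, num_classes=80):
    num_samples = num_pos + num_neg
    # mmdetection's convention: background class index == num_classes
    labels = [num_classes] * num_samples
    label_weights = [0.0] * num_samples
    # positives get weight 1.0 (their true class labels are omitted here)
    for i in range(num_pos):
        label_weights[i] = 1.0
    # negatives also get a fixed weight of 1.0 -- the line in question
    for i in range(num_pos, num_samples):
        label_weights[i] = 1.0
    return labels, label_weights

labels, w = get_target_single_sketch(num_pos=4, num_neg=12)
print(w)  # all 1.0 -- no teacher score involved at this stage
```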

Thank You !!

Best Regards, Anuj

anuj-sharma-19 commented 2 years ago

Okay, I can see it now where it is updated to background cls-score from the teacher https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L235
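For anyone else tracing this, a rough sketch of what that line appears to do (names and numbers are illustrative, not the actual `soft_teacher.py` code): the default weight of 1.0 on background proposals is replaced with the teacher's predicted background score for each proposal.

```python
# Illustrative sketch: overwrite the default 1.0 label weight on
# background proposals with the teacher's predicted background-class
# probability for that proposal; foreground weights stay untouched.
def reweight_bg(labels, label_weights, teacher_bg_scores, num_classes=80):
    out = list(label_weights)
    for i, lbl in enumerate(labels):
        if lbl == num_classes:  # background proposal
            out[i] = teacher_bg_scores[i]
    return out

labels = [3, 80, 80, 7]            # 80 = background class index
weights = [1.0, 1.0, 1.0, 1.0]     # defaults from _get_target_single()
bg_scores = [0.0, 0.9, 0.2, 0.0]   # teacher's background probabilities
print(reweight_bg(labels, weights, bg_scores))  # [1.0, 0.9, 0.2, 1.0]
```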

anuj-sharma-19 commented 2 years ago

In the final classification loss here https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L243

it seems the loss is:

```python
# where `w` is the cls-score from the teacher for a background proposal
total_cls_loss = sum(cls_loss_on_all_fg_proposals) + sum(w * cls_loss_on_bg_proposals)
avg_factor = N_fg + sum(w's on bg proposals)
loss_cls = total_cls_loss / avg_factor
```
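Spelled out as a runnable sketch (the per-proposal losses and teacher scores below are made-up numbers, just to show the arithmetic):

```python
# Sketch of the effective classification loss: foreground losses are
# weighted 1, background losses are weighted by the teacher's background
# score w_j, and the normalizer includes the sum of the w_j.
def soft_teacher_cls_loss(fg_losses, bg_losses, bg_weights):
    total = sum(fg_losses) + sum(w * l for w, l in zip(bg_weights, bg_losses))
    avg_factor = len(fg_losses) + sum(bg_weights)
    return total / avg_factor

fg = [0.5, 0.3]          # per-proposal CE losses on foreground
bg = [0.2, 0.4, 0.1]     # per-proposal CE losses on background
w = [0.9, 0.5, 0.8]      # teacher background scores for those proposals
print(soft_teacher_cls_loss(fg, bg, w))  # (0.8 + 0.46) / (2 + 2.2)
```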

So, avg_factor is not N_fg as in the paper. Also, the w_j as defined in the paper (Eq. 5) does not seem to be used in the code.

It would be really great if you could please clarify these couple of doubts.

Thank You !!

Anuj

MendelXu commented 2 years ago

Yes, you are right. The typo should have been fixed in the paper, but I am not sure why it wasn't. I will discuss this with my collaborators.

anuj-sharma-19 commented 2 years ago

Great. Thanks a lot for the quick clarifications !! :+1:

jackhu-bme commented 2 years ago

Very good and important question. I'm curious about the relationship between this weight setting and the final mAP performance, which would also benefit my own work. I'm commenting here simply to get the latest updates.

lliuz commented 2 years ago

@anuj-sharma-19 @Jack-Hu-2001 The correct equation, as I understand it, is:

    loss_cls = (sum over fg of l_i + sum over bg of w_j * l_j) / (N_fg + sum over bg of w_j)

@MendelXu please check it.

anuj-sharma-19 commented 2 years ago

@lliuz Hi, yes, the above equation matches the code correctly.