Czm369 / MixPL

Mixed Pseudo Labels for Semi-Supervised Object Detection
Apache License 2.0
47 stars 0 forks

About reproduction results with Faster R-CNN on 10% labeled data on the COCO dataset #12

Open tamama9018 opened 1 month ago

tamama9018 commented 1 month ago

Thanks for your great work! I tried to reproduce the Faster R-CNN results on 10% labeled COCO data, and got the following mAP:

2024/03/15 20:47:31 - mmengine - INFO - bbox_mAP_copypaste: 0.112 0.226 0.097 0.054 0.123 0.151
2024/03/15 20:47:31 - mmengine - INFO - Iter(val) [5000/5000]    teacher/coco/bbox_mAP: 0.1490  teacher/coco/bbox_mAP_50: 0.2720  teacher/coco/bbox_mAP_75: 0.1480  teacher/coco/bbox_mAP_s: 0.0790  teacher/coco/bbox_mAP_m: 0.1580  teacher/coco/bbox_mAP_l: 0.2010  student/coco/bbox_mAP: 0.1120  student/coco/bbox_mAP_50: 0.2260  student/coco/bbox_mAP_75: 0.0970  student/coco/bbox_mAP_s: 0.0540  student/coco/bbox_mAP_m: 0.1230  student/coco/bbox_mAP_l: 0.1510  data_time: 0.0071  time: 0.0399
2024/03/15 20:47:31 - mmengine - INFO - Saving checkpoint at 1 epochs

This is my train log: 20240314_032925.log

I don't seem to reach the mAP reported in the paper (37.16 ± 0.15). Am I doing something wrong? I would be happy to receive a reply.

Czm369 commented 1 month ago

https://huggingface.co/czm369/MixPL/tree/main/mixpl_faster-rcnn_r50-caffe_fpn_180k_coco-s1-p10.py

tamama9018 commented 1 month ago

Which script did you use to split the COCO dataset? Also, which validation dataset did you use?
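
For concreteness, the kind of split I mean can be produced with a script along these lines; this is only a sketch with assumed file names and seed handling, not necessarily the tool the repo uses:

```python
# Minimal sketch: split COCO train2017 annotations into a labeled subset
# (e.g. 10% of the images) and an unlabeled remainder. The output file
# names follow the common `instances_train2017.{seed}@{percent}.json`
# convention, but that naming and the seeding are assumptions.
import json
import random


def split_coco(ann_file, percent=10, seed=1,
               labeled_out='instances_train2017.1@10.json',
               unlabeled_out='instances_train2017.1@10-unlabeled.json'):
    with open(ann_file) as f:
        coco = json.load(f)

    random.seed(seed)
    images = coco['images']
    num_labeled = int(len(images) * percent / 100)
    labeled_ids = {img['id'] for img in random.sample(images, num_labeled)}

    def subset(keep_labeled):
        # Keep either the sampled (labeled) images or the remainder,
        # along with the annotations that belong to those images.
        imgs = [im for im in images if (im['id'] in labeled_ids) == keep_labeled]
        ids = {im['id'] for im in imgs}
        anns = [a for a in coco['annotations'] if a['image_id'] in ids]
        return dict(coco, images=imgs, annotations=anns)

    with open(labeled_out, 'w') as f:
        json.dump(subset(True), f)
    with open(unlabeled_out, 'w') as f:
        json.dump(subset(False), f)


if __name__ == '__main__':
    split_coco('annotations/instances_train2017.json')
```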

bharanibala commented 2 weeks ago

Hello,

Hope you are doing well! I am also getting the same results and I am not sure what to do next. Could you please help me if you know anything about this?

Thanks, Bharani.

bharanibala commented 2 weeks ago

https://huggingface.co/czm369/MixPL/tree/main/mixpl_faster-rcnn_r50-caffe_fpn_180k_coco-s1-p10.py

Hello,

Thanks for your great work on the algorithm! I am following your approach, but I am getting TypeError: MeanTeacherHook.__init__() got an unexpected keyword argument 'gamma'. Could you please suggest a workaround?

Thanks, Bharani.
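
For reference, the kind of change I am asking about would look something like this (the custom_hooks key names and values are guesses for illustration, not taken from the actual config):

```python
# Before (fails: the plain MeanTeacherHook does not accept the
# annealing-specific 'gamma' argument):
# custom_hooks = [
#     dict(type='MeanTeacherHook', momentum=0.001, gamma=4),
# ]

# Possible workaround: drop the annealing-specific key and keep the plain hook.
custom_hooks = [
    dict(type='MeanTeacherHook', momentum=0.001),
]
```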

bharanibala commented 3 days ago

https://huggingface.co/czm369/MixPL/tree/main/mixpl_faster-rcnn_r50-caffe_fpn_180k_coco-s1-p10.py

Hi,

I could not find the AnnealMeanTeacherHook module. Is it fine to use MeanTeacherHook instead? Could you please advise?

Thanks, Bharani.

Czm369 commented 3 days ago

AnnealMeanTeacherHook just adds a linear warm-up to MeanTeacher, so you can use MeanTeacherHook instead without affecting performance.
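
For anyone who still wants the warm-up behaviour, a hook along these lines reproduces the idea. This is only a minimal sketch on top of MMDetection's MeanTeacherHook, not the actual AnnealMeanTeacherHook implementation; the schedule direction, class name, and defaults are assumptions:

```python
# Sketch of a linear EMA-momentum warm-up on top of MMDetection's
# MeanTeacherHook. Not the repo's AnnealMeanTeacherHook; names, defaults,
# and the warm-up direction are assumptions.
from mmdet.engine.hooks import MeanTeacherHook
from mmdet.registry import HOOKS


@HOOKS.register_module()
class LinearWarmupMeanTeacherHook(MeanTeacherHook):

    def __init__(self, warmup_iters=1000, warmup_start_momentum=1.0, **kwargs):
        super().__init__(**kwargs)
        self.warmup_iters = warmup_iters
        self.warmup_start_momentum = warmup_start_momentum
        self._target_momentum = self.momentum  # configured steady-state value

    def after_train_iter(self, runner, batch_idx, data_batch=None, outputs=None):
        # Linearly interpolate the EMA momentum from `warmup_start_momentum`
        # to its configured value over the first `warmup_iters` iterations,
        # then delegate the actual teacher update to MeanTeacherHook.
        t = min(1.0, (runner.iter + 1) / self.warmup_iters)
        self.momentum = (1 - t) * self.warmup_start_momentum + t * self._target_momentum
        super().after_train_iter(runner, batch_idx, data_batch, outputs)
```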