microsoft / SoftTeacher

Semi-Supervised Learning, Object Detection, ICCV2021
MIT License

Config files for evaluating the provided models #225

Open Bai-YT opened 1 year ago

Bai-YT commented 1 year ago

Hi. Is it possible to share the config files used for evaluating the weights available through the Google Drive links?

I was trying to reproduce the 44.05% mAP of the "Faster R-CNN (ResNet-50) -- Ours (thr=5e-2)" experiment. However, I only get Average Precision (AP) @[ IoU=0.50:0.95 | area=all | maxDets=100 ] = 0.324, i.e. 32.4%.

The command that I ran was the following:

bash tools/dist_test.sh \
    /home/ubuntu/project/Detection/SoftTeacher/configs/soft_teacher/soft_teacher_faster_rcnn_r50_caffe_fpn_coco_full_720k_eval.py \
    /home/ubuntu/project/Detection/SoftTeacher/work_dirs/soft_teacher_faster_rcnn_r50_caffe_fpn_coco_full_720k/coco_iter_720000.pth \
    1 --eval bbox --cfg-options model.test_cfg.rcnn.score_thr=0.90
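
One detail I am not sure about: the experiment I am targeting is labeled thr=5e-2, while the command above overrides model.test_cfg.rcnn.score_thr to 0.90. If "thr" in the results table refers to this score threshold, I assume the override should instead be something like

--cfg-options model.test_cfg.rcnn.score_thr=5e-2

but please correct me if "thr" means something else here.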

The config file is the following:

_base_ = "base.py"

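# Dataset settings; the paths below point at my local COCO 2017 copy.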
data = dict(
    samples_per_gpu=8,
    workers_per_gpu=5,
    train=dict(
        sup=dict(
            ann_file="/home/ubuntu/project/data/COCO/annotations/instances_train2017.json",
            img_prefix="/home/ubuntu/project/data/COCO/train2017/",
        ),
    ),
    val=dict(
        ann_file="/home/ubuntu/project/data/COCO/annotations/instances_val2017.json",
        img_prefix="/home/ubuntu/project/data/COCO/val2017/",
    ),
    test=dict(
        ann_file="/home/ubuntu/project/data/COCO/annotations/instances_val2017.json",
        img_prefix="/home/ubuntu/project/data/COCO/val2017/",
    ),

    sampler=dict(
        train=dict(
            sample_ratio=[1, 1],
        )
    )
)

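# Soft Teacher wrapper; unsup_weight scales the loss on unlabeled (pseudo-labeled) data.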
semi_wrapper = dict(
    train_cfg=dict(
        unsup_weight=2.0,
    )
)

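# Optimizer hyper-parameters and step LR schedule with an iteration-based runner.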
optimizer = dict(lr=0.01, weight_decay=1e-4, momentum=0.9)
lr_config = dict(step=[300000, 425000])
runner = dict(_delete_=True, type="IterBasedRunner", max_iters=450000)
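
For what it's worth, my understanding is that evaluation should only need the test-set paths on top of base.py, so most of the settings above are probably training-only. A minimal sketch of what I assume an eval-only config would look like (the paths are from my setup, and the samples_per_gpu / workers_per_gpu values are guesses on my part):

_base_ = "base.py"

data = dict(
    samples_per_gpu=1,
    workers_per_gpu=2,
    test=dict(
        ann_file="/home/ubuntu/project/data/COCO/annotations/instances_val2017.json",
        img_prefix="/home/ubuntu/project/data/COCO/val2017/",
    ),
)

Is something like this close to what was used for the released numbers?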

Could someone help me out? Thank you. If there is an existing issue about this that I missed, I apologize in advance.