IDEA-Research / detrex

detrex is a research platform for DETR-based object detection, segmentation, pose estimation and other visual recognition tasks.
https://detrex.readthedocs.io/en/latest/
Apache License 2.0

Pretrained model for DINO-Swin may not be correct #339

Closed: shenyi0220 closed this issue 5 months ago

shenyi0220 commented 5 months ago

Hi authors,

I cannot reproduce the DINO-Swin experiments using "projects/dino/configs/dino-swin/dino_swin_large_384_5scale_12ep.py". My setup is identical in every respect, except that I downloaded the pretrained backbone from "https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22kto1k.pth".

Is the above link correct? I strongly suspect the pretrained model is wrong.
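
For reference, one way to sanity-check the downloaded file is to load it and inspect the weight names and shapes. A minimal sketch, assuming the checkpoint sits in the working directory and that the weights are wrapped under a "model" key, as the official Swin release checkpoints usually are (both are assumptions, adjust as needed):

```python
import torch

# Hypothetical local path -- adjust to wherever the file was downloaded.
ckpt_path = "swin_large_patch4_window12_384_22kto1k.pth"

# Load on CPU so no GPU is needed just to inspect the file.
ckpt = torch.load(ckpt_path, map_location="cpu")

# Official Swin checkpoints typically wrap the weights under a "model" key;
# fall back to the raw dict if that key is absent.
state_dict = ckpt.get("model", ckpt)

print(f"{len(state_dict)} tensors in the checkpoint")
for name, tensor in list(state_dict.items())[:5]:
    print(f"  {name}: {tuple(tensor.shape)}")
```

If the file loads and the parameter names look like Swin layers (patch_embed, layers.*.blocks.*, etc.), the download itself is probably intact and the problem lies elsewhere.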

The symptom is that training runs without any issues, but after the first epoch the mAP is extremely low:

[02/04 04:47:07 d2.evaluation.evaluator]: Inference done 590/625. Dataloading: 0.0017 s/iter. Inference: 0.2730 s/iter. Eval: 0.0005 s/iter. Total: 0.2752 s/iter. ETA=0:00:09
[02/04 04:47:12 d2.evaluation.evaluator]: Inference done 609/625. Dataloading: 0.0017 s/iter. Inference: 0.2727 s/iter. Eval: 0.0005 s/iter. Total: 0.2749 s/iter. ETA=0:00:04
[02/04 04:47:17 d2.evaluation.evaluator]: Total inference time: 0:02:51.055826 (0.275896 s / iter per device, on 8 devices)
[02/04 04:47:17 d2.evaluation.evaluator]: Total inference pure compute time: 0:02:48 (0.272303 s / iter per device, on 8 devices)
[02/04 04:47:21 d2.evaluation.coco_evaluation]: Preparing results for COCO format ...
[02/04 04:47:21 d2.evaluation.coco_evaluation]: Evaluating predictions with unofficial COCO API...
Loading and preparing results...
DONE (t=3.68s)
creating index...
index created!
[02/04 04:47:26 d2.evaluation.fast_eval_api]: Evaluate annotation type bbox
[02/04 04:47:36 d2.evaluation.fast_eval_api]: COCOeval_opt.evaluate() finished in 9.82 seconds.
[02/04 04:47:36 d2.evaluation.fast_eval_api]: Accumulating evaluation results...
[02/04 04:47:38 d2.evaluation.fast_eval_api]: COCOeval_opt.accumulate() finished in 1.32 seconds.
Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.000
Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.000
Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.000
Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.000
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.008
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.011
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.011

shenyi0220 commented 5 months ago

Resolved. The root cause was that the detectron2 repo had been accidentally updated; the pretrained backbone link itself is fine.
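
For anyone hitting the same thing: it can help to record which detectron2 build is actually on the Python path before each run, so an accidental update is easy to spot. A minimal sketch, assuming a standard detectron2 install (collect_env_info is detectron2's own environment dump):

```python
import detectron2
from detectron2.utils.collect_env import collect_env_info

# Record which detectron2 is actually imported, so an accidental
# update of the local repo shows up as a diff between runs.
print("detectron2 version:", detectron2.__version__)
print("imported from:", detectron2.__file__)

# Full environment report (PyTorch, CUDA, detectron2 build details).
print(collect_env_info())
```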

shenyi0220 commented 5 months ago

Closing this as resolved.