IDEA-Research / GroundingDINO

[ECCV 2024] Official implementation of the paper "Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection"
https://arxiv.org/abs/2303.05499
Apache License 2.0

All zeros during zero-shot evaluation #196

Open · Mukil07 opened this issue 1 year ago

Mukil07 commented 1 year ago

Amazing work!! I just wanted to benchmark it on COCO, so I ran your zero-shot evaluation code with the Swin-T backbone:

```
CUDA_VISIBLE_DEVICES=0 \
python demo/test_ap_on_coco.py \
  -c groundingdino/config/GroundingDINO_SwinT_OGC.py \
  -p weights/groundingdino_swint_ogc.pth \
  --anno_path /home/c3-0/datasets/coco/annotations/instances_val2017.json \
  --image_dir /home/c3-0/datasets/coco/val2017
```

I was getting this error:

```
Traceback (most recent call last):
  File "demo/test_ap_on_coco.py", line 233, in <module>
    main(args)
  File "demo/test_ap_on_coco.py", line 145, in main
    model = load_model(args.config_file, args.checkpoint_path)
  File "demo/test_ap_on_coco.py", line 29, in load_model
    model.load_state_dict(clean_state_dict(checkpoint['ema_model']), strict=False)
KeyError: 'ema_model'
```
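For reference, the checkpoint's top-level keys can be listed directly; a minimal sketch, using the weights path from the command above:

```python
import torch

# List the top-level keys of the downloaded checkpoint (path as in the command above).
checkpoint = torch.load("weights/groundingdino_swint_ogc.pth", map_location="cpu")
print(list(checkpoint.keys()))  # no 'ema_model' entry here, matching the KeyError above
```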

When I remove the "ema_model" key in test_ap_on_coco.py, I get all zeros during evaluation:

```
Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.000
Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.000
Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.000
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.000
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.000
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.000
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.000
Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.000
Final results: [7.943160017818097e-10, 7.943160017818096e-09, 0.0, 0.0, 2.655844983639995e-08, 0.0, 0.0, 0.0, 4.41696113074205e-06, 0.0, 1.524390243902439e-05, 0.0]
```

Could you please look into this? Thank you.

arnavmdas commented 1 year ago

Fixed it by replacing

```
model.load_state_dict(clean_state_dict(checkpoint['ema_model']), strict=False)
```

with

```
model.load_state_dict(clean_state_dict(checkpoint['model']), strict=False)
```

Seems like you may be evaluating a randomly initialized model.
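For anyone hitting the same thing, here is a hedged sketch of a `load_model` with a key fallback, so it works whether the checkpoint carries an EMA copy under `'ema_model'` or, like the released `groundingdino_swint_ogc.pth`, stores the weights under `'model'`. The imports and helper names (`SLConfig`, `build_model`, `clean_state_dict`) follow the repo's layout under `groundingdino/`; adjust them if your version differs.

```python
import torch

from groundingdino.models import build_model
from groundingdino.util.slconfig import SLConfig
from groundingdino.util.utils import clean_state_dict


def load_model(config_path, checkpoint_path, device="cpu"):
    # Build the model from the config, as demo/test_ap_on_coco.py does.
    args = SLConfig.fromfile(config_path)
    args.device = device
    model = build_model(args)

    checkpoint = torch.load(checkpoint_path, map_location="cpu")
    # Training checkpoints may carry an EMA copy under 'ema_model';
    # the released groundingdino_swint_ogc.pth keeps the weights under 'model'.
    if "ema_model" in checkpoint:
        state_dict = checkpoint["ema_model"]
    elif "model" in checkpoint:
        state_dict = checkpoint["model"]
    else:
        state_dict = checkpoint  # assume the file is a bare state dict

    load_result = model.load_state_dict(clean_state_dict(state_dict), strict=False)
    print(load_result)  # near-empty missing/unexpected key lists => weights actually loaded
    model.eval()
    return model
```

Printing the return value of `load_state_dict` is a useful sanity check here: with `strict=False`, a completely mismatched state dict fails silently, and you end up scoring random weights, which is exactly what the all-zero AP above looks like.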