jrmedc opened 3 weeks ago
Hello,
I am reiterating my request. I have tried modifying `coco_metrics` and `cocoeval`, but without success.
Has anyone evaluated a model's performance with a custom IoU threshold for the AP or mAP metrics? I have tried the `iou_thrs` attribute for `test_evaluator` in the config file, but didn't get the desired result (as described in the original message below).
Hello,
I am trying to run a test on a model I trained and get mAP at custom IoU thresholds. The model is Co-DETR (detection).
I am calling the test evaluator like this (I also added the `iou_thrs` param in `val_evaluator`):

```python
test_evaluator = dict(
    ann_file='',
    backend_args=None,
    format_only=False,
    metric='bbox',
    iou_thrs=[0.3, 0.4, 0.5, 0.8],
    type='CocoMetric')
```
But then the output does not include any results for IoU thresholds of 0.3 or 0.4. Here are the results I get:

```
Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100  ] = 0.181
Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.402
Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.153
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = -1.000
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.021
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.193
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100  ] = 0.421
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300  ] = 0.421
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.421
Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = -1.000
Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.039
Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.439
```
Is there something I am doing wrong? Shouldn't the custom IoU thresholds I declared be reflected in the displayed results?
Thanks in advance for the help.
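For anyone hitting the same wall: as far as I can tell, pycocotools' `COCOeval.summarize()` hardcodes which IoU thresholds it prints (0.50:0.95, 0.50, and 0.75), so even when `iou_thrs` is forwarded to `params.iouThrs`, the custom values are computed but never shown in the printed table. Below is a minimal sketch of reading AP at a custom threshold directly from the accumulated precision array; the function `ap_at_iou` and the synthetic array shape are my own illustration, assuming the usual `COCOeval` layout where `eval['precision']` has shape `[T, R, K, A, M]` (IoU thresholds, recall points, categories, area ranges, maxDets settings):

```python
import numpy as np

# Custom thresholds, matching what was passed as iou_thrs / params.iouThrs.
iou_thrs = np.array([0.3, 0.4, 0.5, 0.8])

def ap_at_iou(precision, iou_thrs, thr):
    """AP at a single IoU threshold, averaged over recall points and
    categories, for area='all' (index 0) and the last maxDets setting
    (index -1). Entries of -1 mark slots with no ground truth and are
    excluded, mirroring how COCOeval's summary averages precision."""
    # Locate the requested threshold along the T axis.
    t = int(np.where(np.isclose(iou_thrs, thr))[0][0])
    p = precision[t, :, :, 0, -1]
    valid = p[p > -1]
    return float(np.mean(valid)) if valid.size else float('nan')
```

After running evaluation you would call something like `ap_at_iou(coco_eval.eval['precision'], coco_eval.params.iouThrs, 0.3)` to get the AP@0.3 that the printed summary omits. This is a sketch under the assumptions above, not a tested drop-in for MMDetection's `CocoMetric`.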