open-mmlab / mmdetection

OpenMMLab Detection Toolbox and Benchmark
https://mmdetection.readthedocs.io
Apache License 2.0

How to show only precision and recall at different confidence threshold and IoU threshold #8356

Open ibrahim1611 opened 2 years ago

ibrahim1611 commented 2 years ago

After every checkpoint, evaluation on the test set completes successfully and shows the results below. But how can we show only precision and recall (not AP and AR)?

```
Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100  ] = 0.364
Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.583
Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.416
Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = -1.000
Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.174
Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.373
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100  ] = 0.593
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300  ] = 0.593
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.593
Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = -1.000
Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.333
Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.605
```

jbwang1997 commented 2 years ago

You need to modify the evaluation code here to implement this.
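As a starting point, after `pycocotools`' `COCOeval.accumulate()` runs, the raw curves live in `cocoEval.eval`: `eval['precision']` has shape `[T, R, K, A, M]` (IoU thresholds, recall thresholds, classes, area ranges, maxDets) and `-1` marks entries with no ground truth. The sketch below fakes such an array just to demonstrate the indexing; `per_class_precision` and the dimension sizes are hypothetical, not mmdetection code:

```python
import numpy as np

# COCOeval convention: precision[T, R, K, A, M]
#   T = 10 IoU thresholds 0.50:0.05:0.95, R = 101 recall thresholds,
#   K = classes, A = area ranges (all/small/medium/large), M = maxDets settings.
iou_thrs = np.linspace(0.5, 0.95, 10)
num_classes, num_areas, num_maxdets = 3, 4, 3

# Dummy stand-in for cocoEval.eval['precision'] (values in [0, 1)).
rng = np.random.default_rng(0)
precision = rng.random((10, 101, num_classes, num_areas, num_maxdets))

def per_class_precision(precision, iou, area_idx=0, maxdets_idx=-1):
    """Mean precision per class at one IoU threshold (area='all' by default)."""
    t = int(np.argmin(np.abs(iou_thrs - iou)))     # pick the IoU slice
    p = precision[t, :, :, area_idx, maxdets_idx]  # [R, K]
    out = []
    for k in range(p.shape[1]):
        vals = p[:, k][p[:, k] > -1]               # drop -1 ("no GT") entries
        out.append(float(vals.mean()) if vals.size else float('nan'))
    return out

print(per_class_precision(precision, iou=0.5))     # one value per class
```

`eval['recall']` can be read the same way, with shape `[T, K, A, M]` (no recall-threshold axis).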

shuoshuo09 commented 2 years ago

How should I modify it?

wanghonglie commented 1 year ago

CocoMetric only provides the 12 metrics from cocoapi.

If you only need precision and recall, you can print them after this line: https://github.com/open-mmlab/mmdetection/blob/9d3e162459590eee4cfc891218dfbb5878378842/mmdet/core/evaluation/mean_ap.py#L674

That way you can see the precision and recall of each class.
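To answer the original question about confidence thresholds: once each detection has a TP/FP flag at your chosen IoU, precision and recall at a given confidence threshold are a simple count. A minimal sketch (the helper `precision_recall_at` is hypothetical, not an mmdetection API):

```python
import numpy as np

def precision_recall_at(scores, tp, num_gt, conf_thr):
    """Precision/recall counting only detections with score >= conf_thr.

    scores:  confidence score of each detection
    tp:      1 if the detection matched a GT box at the chosen IoU, else 0
    num_gt:  total number of ground-truth boxes
    """
    keep = scores >= conf_thr
    n_tp = int(tp[keep].sum())       # true positives above the threshold
    n_det = int(keep.sum())          # all detections above the threshold
    precision = n_tp / n_det if n_det else 0.0
    recall = n_tp / num_gt if num_gt else 0.0
    return precision, recall

scores = np.array([0.9, 0.8, 0.6, 0.4, 0.3])
tp     = np.array([1,   1,   0,   1,   0])
p, r = precision_recall_at(scores, tp, num_gt=4, conf_thr=0.5)
# keeps the 3 detections scoring >= 0.5, of which 2 are TPs: p = 2/3, r = 0.5
```

Sweeping `conf_thr` over a grid gives precision/recall at each operating point, which is what the summary table above averages away.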