ibrahim1611 opened 2 years ago
You need to modify the evaluation code here to implement this.
How should I modify it?
CocoMetric only provides the 12 standard metrics defined by cocoapi.
If you only need precision and recall, you can print the metrics after this line: https://github.com/open-mmlab/mmdetection/blob/9d3e162459590eee4cfc891218dfbb5878378842/mmdet/core/evaluation/mean_ap.py#L674
That way you can see the precision and recall of each class.
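At that point in mean_ap.py, the precision and recall curves have already been derived from the cumulative true-positive and false-positive counts, so printing them per class is mostly a matter of formatting. Here is a minimal, self-contained sketch of that computation (the function name `precision_recall_from_cumulative` and the toy inputs are my own; the formulas mirror how mean_ap.py derives the curves):

```python
import numpy as np

def precision_recall_from_cumulative(tp, fp, num_gts):
    """Compute precision/recall curves from cumulative TP/FP counts,
    in the same way mean_ap.py derives them (simplified sketch)."""
    tp = np.asarray(tp, dtype=np.float64)
    fp = np.asarray(fp, dtype=np.float64)
    eps = np.finfo(np.float64).eps
    recalls = tp / max(num_gts, eps)            # fraction of GT boxes found
    precisions = tp / np.maximum(tp + fp, eps)  # fraction of detections correct
    return precisions, recalls

# Toy example: 5 detections sorted by score, cumulative TP/FP, 4 GT boxes.
tp = [1, 2, 2, 3, 3]
fp = [0, 0, 1, 1, 2]
prec, rec = precision_recall_from_cumulative(tp, fp, num_gts=4)
print('precision:', prec)  # final precision = 3 / 5 = 0.6
print('recall:', rec)      # final recall = 3 / 4 = 0.75
```

In the real code the `tp`/`fp` arrays are computed per class, so you would print one such pair per class label.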
After every checkpoint, evaluation on the test set completes successfully and shows the results below. But how can we show only precision and recall (not AP and AR)?
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.364
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.583
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.416
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = -1.000
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.174
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.373
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.593
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.593
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.593
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = -1.000
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.333
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.605
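If the evaluation above goes through pycocotools, the raw curves behind that summary are kept in `COCOeval.eval['precision']` (shape `[T, R, K, A, M]`: IoU thresholds, recall thresholds, classes, area ranges, maxDets settings) and `COCOeval.eval['recall']` (shape `[T, K, A, M]`). A sketch of pulling out per-class precision and recall at one IoU threshold (the helper name `per_class_pr` is my own, and the dict below is a synthetic stand-in for a real `coco_eval.eval`):

```python
import numpy as np

def per_class_pr(eval_dict, iou_idx=0, area_idx=0, maxdet_idx=-1):
    """Extract per-class precision and recall from a COCOeval-style
    eval dict. With the default thresholds np.arange(0.5, 1.0, 0.05),
    iou_idx=0 corresponds to IoU=0.50; area_idx=0 is 'all'."""
    prec = eval_dict['precision']  # [T, R, K, A, M]
    rec = eval_dict['recall']      # [T, K, A, M]
    num_classes = prec.shape[2]
    results = []
    for k in range(num_classes):
        p = prec[iou_idx, :, k, area_idx, maxdet_idx]
        p = p[p > -1]  # -1 marks categories absent from the GT
        r = rec[iou_idx, k, area_idx, maxdet_idx]
        results.append((p.mean() if p.size else float('nan'), float(r)))
    return results

# Synthetic stand-in: 10 IoU thresholds, 101 recall points, 2 classes,
# 4 area ranges, 3 maxDets settings.
rng = np.random.default_rng(0)
fake = {
    'precision': rng.uniform(0, 1, size=(10, 101, 2, 4, 3)),
    'recall': rng.uniform(0, 1, size=(10, 2, 4, 3)),
}
for cls_id, (p, r) in enumerate(per_class_pr(fake)):
    print(f'class {cls_id}: precision={p:.3f}, recall={r:.3f}')
```

With a real run you would pass `coco_eval.eval` after calling `coco_eval.evaluate()` and `coco_eval.accumulate()`, and map the class index back to a category name via the dataset's category list.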