Closed: JohannesEsslinger closed this issue 2 years ago
Hi, the COCO dataset evaluation uses pycocotools. For more details, you may have a look at the source code of pycocotools.
Here is the evaluation code in mmdet:
https://github.com/open-mmlab/mmdetection/blob/3e2693151add9b5d6db99b944da020cba837266b/mmdet/datasets/coco.py#L592-L616
in which `proposal_nums` is the maxDets list.
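For reference, the linked mmdet code effectively forwards the config's `proposal_nums` to pycocotools as the maxDets list. A paraphrased sketch (the `Params` class below is a stand-in for pycocotools' `COCOeval.params`, and the default tuple is an assumption based on mmdet's `CocoDataset`):

```python
# Paraphrased sketch of how mmdet hands proposal_nums to pycocotools.
# Params is a stand-in for pycocotools' COCOeval.params object.
class Params:
    def __init__(self):
        self.maxDets = [1, 10, 100]  # pycocotools' default

proposal_nums = (100, 300, 1000)  # mmdet's default for bbox evaluation

params = Params()
params.maxDets = list(proposal_nums)  # what the linked coco.py effectively does
print(params.maxDets)
```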
Yeah, I already looked at it, but couldn't figure it out.
I haven't looked at the source code before; I'll have a look and re-answer this issue later.
For 1.): Yes, it is the maximum number of detections evaluated per single image.
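If it helps, pycocotools applies the cap per image by sorting that image's detections by confidence score and keeping only the top maxDet of them. A simplified sketch of what `COCOeval.evaluateImg` does (the helper name is mine, not pycocotools'):

```python
# Simplified sketch of pycocotools' per-image truncation: detections for one
# image are sorted by confidence score and only the top max_det are evaluated.
def truncate_detections(dets, max_det):
    """dets: list of (score, label) pairs for a single image."""
    return sorted(dets, key=lambda d: d[0], reverse=True)[:max_det]

dets = [(0.9, "cat"), (0.2, "dog"), (0.7, "cat")]
print(truncate_detections(dets, 2))  # the 0.9 and 0.7 detections survive
```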
For 3.): You can't change it in mmdetection. The mismatch in maxDets comes from pycocotools, where maxDets for the first mAP line is hardcoded:
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.839
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.984
See: https://github.com/cocodataset/cocoapi/issues/558
For 2.): But you can easily change the maxDets parameter in the pycocotools code.
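A minimal sketch of doing that without patching pycocotools itself: `COCOeval.params.maxDets` can simply be overwritten before evaluation. The function name and file paths below are placeholders; note that `summarize()` hardcodes indices 0..2 of the maxDets list, so it should contain exactly three entries.

```python
def evaluate_with_maxdets(gt_path, dt_path, max_dets=(10, 100, 1000)):
    """Run COCO bbox evaluation with a custom maxDets list.

    summarize() hardcodes maxDets[0..2], so pass exactly three values.
    """
    # Imported here so the sketch can be read without pycocotools installed.
    from pycocotools.coco import COCO
    from pycocotools.cocoeval import COCOeval

    coco_gt = COCO(gt_path)             # COCO-format ground-truth annotations
    coco_dt = coco_gt.loadRes(dt_path)  # COCO-format detection results
    coco_eval = COCOeval(coco_gt, coco_dt, iouType="bbox")
    coco_eval.params.maxDets = list(max_dets)
    coco_eval.evaluate()
    coco_eval.accumulate()
    coco_eval.summarize()
    return coco_eval.stats
```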
I now get the following results for different maxDets:
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets= 1 ] = 0.040
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets= 10 ] = 0.351
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=100 ] = 0.977
So maxDets is an important parameter that strongly influences both the average recall and the average precision metrics.
But I still don't get why the recall is so high (see question 4.)) for maxDets = 10. Furthermore, I am getting more confused because the average precision increases with higher maxDets. I thought that only the best detections get evaluated, and therefore the average precision with a small maxDets should be close to 1.
I hope somebody can explain that.
In case anyone else finds this issue, I believe the answer is here: https://github.com/cocodataset/cocoapi/issues/222#issuecomment-1898909053
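One way to see why AP increases with maxDets (question 2 above): COCO AP averages precision over 101 recall thresholds, and recall levels a capped detector can never reach contribute zero precision. A toy calculation under that assumption (my own simplification, not pycocotools code):

```python
# Toy model: with num_gt objects per image and a cap of max_dets detections,
# recall can never exceed max_dets / num_gt, so every recall threshold above
# that bound contributes 0 precision to the 101-point COCO average.
def capped_ap(num_gt, max_dets, precision=1.0):
    reachable = min(max_dets / num_gt, 1.0)
    recall_levels = [i / 100 for i in range(101)]
    return sum(precision if r <= reachable else 0.0 for r in recall_levels) / 101

print(round(capped_ap(70, 10), 3))   # ~0.149: even a perfect detector is capped
print(round(capped_ap(70, 100), 3))  # 1.0: the cap no longer binds
```

So with 70 objects per image and maxDets=10, even a perfect detector cannot score much above ~0.15 AP, which is why small maxDets values pull the metric far below 1.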
Hello,
I have some questions about the maxDets parameter in the evaluation.
I understand that maxDets typically refers to the maximum number of proposals used for the evaluation, but:
1.) Does the parameter determine the maximum number of proposals evaluated per single image?
2.) I thought that maxDets is only relevant for the recall. Why does the mAP also depend on maxDets? The terminal prints:
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.839
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.984
3.) How can you change maxDets for the precision? For the recall you can simply define it in the config file, for example:
evaluation=dict( classwise=True, proposal_nums=(10, 100, 1000), etc...
4.) How is the following result possible with maxDets=10, when every image in the dataset has at least 70 objects in it?
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 10 ] = 0.323
I hope you can help me out! Thanks in advance!