yichuangzhang opened 2 years ago
The COCO evaluation metric `bbox` is used to evaluate the COCO dataset; it sets iou_thrs = [0.5:0.95:0.05], i.e. IoU thresholds from 0.5 to 0.95 in steps of 0.05, and uses pycocotools to compute the results.
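As a rough sketch of what that means in practice, the `bbox` metric essentially delegates to pycocotools' `COCOeval`, whose default IoU thresholds sweep 0.50:0.95 in 0.05 steps. The annotation and result file names below are hypothetical placeholders:

```python
import numpy as np
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# The IoU thresholds pycocotools sweeps by default: 0.50, 0.55, ..., 0.95.
iou_thrs = np.linspace(0.5, 0.95, int(np.round((0.95 - 0.5) / 0.05)) + 1)

# Hypothetical file paths, just for illustration.
coco_gt = COCO('annotations/instances_val2017.json')
coco_dt = coco_gt.loadRes('results.bbox.json')

coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')
coco_eval.params.iouThrs = iou_thrs  # same as the pycocotools default
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints AP@[0.50:0.95], AP@0.50, AP@0.75, etc.
```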
The VOC evaluation metric `mAP` uses iou_thr=0.5 by default and the 11-point interpolation method to calculate AP. The evaluation code is in mmdet/core/evaluation/mean_ap.py.
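If you want to call the VOC-style evaluation yourself, a minimal sketch of `eval_map` (from mmdet/core/evaluation/mean_ap.py) with tiny synthetic inputs looks roughly like this; passing `dataset='voc07'` is what selects the 11-point protocol:

```python
import numpy as np
from mmdet.core.evaluation import eval_map

# Tiny synthetic example (1 image, 1 class) just to show the expected shapes.
det_results = [  # per image -> per class -> (n, 5) array [x1, y1, x2, y2, score]
    [np.array([[10., 10., 50., 50., 0.9],
               [60., 60., 90., 90., 0.3]])]
]
annotations = [  # per image: ground-truth boxes and labels
    dict(bboxes=np.array([[12., 12., 48., 48.]]), labels=np.array([0]))
]

mean_ap, per_class = eval_map(
    det_results,
    annotations,
    iou_thr=0.5,       # VOC default
    dataset='voc07',   # selects the 11-point interpolation protocol
    logger='print')    # prints a per-class table
```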
For more details, you can have a look at the `dataset.evaluate` code in mmdet/datasets, for example:
https://github.com/open-mmlab/mmdetection/blob/70f6d9cfade4a2f0b198e4f64776521d181b28be/mmdet/datasets/voc.py#L34-L41
and
https://github.com/open-mmlab/mmdetection/blob/70f6d9cfade4a2f0b198e4f64776521d181b28be/mmdet/datasets/coco.py#L386-L395
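For reference, selecting the metric in a config (mmdetection 2.x style) is a one-line setting; this is an illustrative sketch rather than a snippet from a specific config:

```python
# Evaluation settings in a config file: pick the metric matching the dataset type.
evaluation = dict(interval=1, metric='bbox')   # CocoDataset -> pycocotools bbox AP
# evaluation = dict(interval=1, metric='mAP')  # VOCDataset  -> eval_map with iou_thr=0.5
```

The same metric names can also be passed to tools/test.py via `--eval bbox` (COCO-style) or `--eval mAP` (VOC-style).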
Thanks for your error report and we appreciate it a lot.
Checklist
Describe the bug
A clear and concise description of what the bug is.
Reproduction
Environment
Please run `python mmdet/utils/collect_env.py` to collect necessary environment information and paste it here. You may also note other relevant environment variables (such as $PATH, $LD_LIBRARY_PATH, $PYTHONPATH, etc.).
Error traceback
If applicable, paste the error traceback here.
Bug fix
If you have already identified the reason, you can provide the information here. If you are willing to create a PR to fix it, please also leave a comment here and that would be much appreciated!