THU-MIG / yolov10

YOLOv10: Real-Time End-to-End Object Detection [NeurIPS 2024]
https://arxiv.org/abs/2405.14458
GNU Affero General Public License v3.0

yolov10n verification of COCO test2017 MAP bug #356

Open yang-0201 opened 4 months ago

yang-0201 commented 4 months ago

When I ran the official code to reproduce the accuracy of yolov10n on COCO test2017, I used the `coco.yaml` configuration file and ran inference on the test split with this code:

```python
from ultralytics import YOLOv10

if __name__ == '__main__':
    model = YOLOv10('yolov10n.pt')
    model.val(data='coco.yaml', device=0, split='test', save_json=True)
```

I then submitted the saved JSON file to the official evaluation server: https://codalab.lisn.upsaclay.fr/competitions/7384#participate-submit_results. The submission results showed a mAP of almost exactly 0. After troubleshooting, I determined that the category IDs were the problem. There is a bug in the code:
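Before submitting, the symptom can be confirmed locally by inspecting the category IDs in the saved JSON (a sketch; the exact output path is an assumption — `model.val(save_json=True)` writes a `predictions.json` under the run directory, adjust the path as needed):

```python
import json


def category_id_range(json_path: str):
    """Return (min, max) over the category_id fields of a COCO-format
    detections JSON, e.g. the predictions file saved by model.val(save_json=True)."""
    with open(json_path) as f:
        preds = json.load(f)
    ids = [p["category_id"] for p in preds]
    return min(ids), max(ids)


# A correctly remapped COCO submission uses the sparse 1-90 category-ID scheme.
# If the max ID is 79 (the model's contiguous 0-79 class indices), the COCO
# remapping was skipped and the evaluation server will score ~0 mAP.
```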

In `ultralytics/models/yolo/detect/val.py`:

```python
self.is_coco = isinstance(val, str) and "coco" in val and val.endswith(f"{os.sep}val2017.txt")
```

The current code only recognizes `val2017.txt` as a COCO dataset and applies the category-ID transformation; if the split file is `test-dev2017.txt`, it is not recognized as COCO, so the category IDs are never converted. Change the code to:

```python
self.is_coco = isinstance(val, str) and "coco" in val and (val.endswith(f"{os.sep}val2017.txt") or val.endswith(f"{os.sep}test-dev2017.txt"))
```
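The patched check can be exercised on its own (a minimal sketch; `is_coco_split` is a hypothetical standalone name for the expression above):

```python
import os


def is_coco_split(val) -> bool:
    """Standalone version of the patched self.is_coco check: recognizes
    both the COCO val2017 and test-dev2017 split list files."""
    return (
        isinstance(val, str)
        and "coco" in val
        and (val.endswith(f"{os.sep}val2017.txt") or val.endswith(f"{os.sep}test-dev2017.txt"))
    )


# os.path.join keeps the examples portable across path separators.
print(is_coco_split(os.path.join("coco", "val2017.txt")))       # True
print(is_coco_split(os.path.join("coco", "test-dev2017.txt")))  # True: newly recognized
print(is_coco_split(os.path.join("voc", "test2017.txt")))       # False: not a COCO dataset
```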

After modifying the code, you can get the correct result: yolov10n-cocotest2017.txt

Overall performance:

```
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.387
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.541
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.421
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.174
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.417
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.537
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.323
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.537
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.585
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.328
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.637
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.779
Done (t=508.27s)
```

yang-0201 commented 4 months ago

I've resolved the issue and submitted PR #357.