rafaelpadilla / review_object_detection_metrics

Object Detection Metrics. 14 object detection metrics: mean Average Precision (mAP), Average Recall (AR), Spatio-Temporal Tube Average Precision (STT-AP). This project supports different bounding box formats as in COCO, PASCAL, Imagenet, etc.

Zero scores #91

Closed alsheabi closed 2 years ago

alsheabi commented 2 years ago

I used GT and DT files with IoU 0.5 in both https://github.com/rafaelpadilla/Object-Detection-Metrics and https://github.com/Cartucho/mAP and got 97% mAP, but when I use this repo I get zero!

GT files (absolute coordinates, .txt): `class_name x1 y1 x2 y2`
DR files (absolute coordinates, .txt): `class_name confidence x1 y1 x2 y2`

When I set the IoU threshold to 0.1, I get a score of 10%.
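A quick way to debug a symptom like this (zero at IoU 0.5, non-zero at 0.1) is to compute the IoU of one GT/DT pair directly, outside any evaluation tool. Below is a minimal sketch, not this repo's API; the class name, confidence, and coordinates are hypothetical file contents in the format quoted above. If a pair that should match yields near-zero IoU, the boxes are likely being parsed in the wrong coordinate format (e.g. corner coordinates read as width/height, or relative values read as absolute).

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2) absolute corners."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    iw, ih = max(0.0, ix2 - ix1), max(0.0, iy2 - iy1)
    inter = iw * ih
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Hypothetical file lines in the formats described above.
gt_line = "car 100 100 200 200"        # class_name x1 y1 x2 y2
dr_line = "car 0.95 110 110 210 210"   # class_name confidence x1 y1 x2 y2

gt = tuple(float(v) for v in gt_line.split()[1:])
dt = tuple(float(v) for v in dr_line.split()[2:])

print(f"IoU = {iou(gt, dt):.3f}")      # ~0.681 for these boxes
for t in (0.5, 0.3, 0.1):
    print(f"match at IoU >= {t}: {iou(gt, dt) >= t}")
```

If this prints a sensible IoU but the evaluator still reports zero, the next thing to check is which coordinate format the tool was told to expect when loading the files.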

Result using this Repo:

[screenshot: iou]

Result using @rafaelpadilla's https://github.com/rafaelpadilla/Object-Detection-Metrics:

[screenshot: iou1]

github-actions[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

ooobelix commented 2 years ago

Hi @alsheabi, is it the same behaviour as https://github.com/rafaelpadilla/review_object_detection_metrics/issues/68 ?

alsheabi commented 2 years ago

> Hi @alsheabi, is it the same behaviour as #68 ?

@ooobelix Hi, I think it's different, because when I set the IoU threshold below 0.1, I do get a score!
