rafaelpadilla / review_object_detection_metrics

Object Detection Metrics. 14 object detection metrics: mean Average Precision (mAP), Average Recall (AR), Spatio-Temporal Tube Average Precision (STT-AP). This project supports different bounding box formats as in COCO, PASCAL, Imagenet, etc.

No results #121

Closed: Alioth-1701 closed this issue 1 year ago

Alioth-1701 commented 1 year ago

I entered the information as the instructions described, but there were no results after running the program. Every value shows as 0, and nothing is drawn on the graphs.

[screenshot attachment]

The result looks like this:

[screenshot attachment]

The "show detections statistics" function displays the bounding boxes perfectly, so I am unsure whether the problem is with my class file (I have only one class) or something else. I wrote the class file like this:

[screenshot attachment]

Please help me with this. Thank you so much.
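For the single-class setup described above, one frequent cause of every metric evaluating to zero is that the class label written in the ground-truth files does not literally match the one in the detection files (for example, a numeric id on one side and the class name on the other), so no detection is ever paired with a ground truth. Below is a minimal sketch to compare the two label sets; it only assumes that the class label is the first token on each annotation line, and the folder names are hypothetical.

```python
import glob
import os

# Hypothetical folder names; adjust to the paths selected in the tool.
GT_DIR = "groundtruths"
DET_DIR = "detections"

def class_labels(folder):
    """Collect the first token (class label) of every line in every .txt file."""
    labels = set()
    for path in glob.glob(os.path.join(folder, "*.txt")):
        with open(path) as f:
            for line in f:
                if line.strip():
                    labels.add(line.split()[0])
    return labels

gt_labels = class_labels(GT_DIR)
det_labels = class_labels(DET_DIR)
print("ground-truth labels:", gt_labels)
print("detection labels:   ", det_labels)
# The two sets must match exactly (e.g. both {"0"} or both {"person"});
# otherwise no detection is ever matched and every metric comes out 0.
```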

rafaelpadilla commented 1 year ago

Could you please share some of your images, detection and ground-truth files?

kalikhademi commented 1 year ago

I tried the evaluators from the script and they show zero for all metrics too.

groundtruth_bbs = coco2bb("annotation.json", bb_type=BBType.GROUND_TRUTH)
detected_bbs = coco2bb("detection.json", bb_type=BBType.DETECTED)
result = get_coco_summary(groundtruth_bbs, detected_bbs)
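A frequent reason for get_coco_summary returning zeros with COCO-style files is that the annotations and detections never get matched because their image_id or category_id values disagree, or the detections are missing scores. Here is a minimal stdlib sketch to check that before evaluating; it assumes annotation.json follows the COCO ground-truth schema and detection.json the COCO results-list schema (a list of dicts with image_id, category_id, bbox, score), matching the snippet above.

```python
import json

# Sanity check (stdlib only): verify that the ground-truth and detection
# files refer to the same image and category ids, since a mismatch
# silently yields zero for every metric.
with open("annotation.json") as f:
    gt = json.load(f)       # COCO ground-truth schema (assumed)
with open("detection.json") as f:
    dets = json.load(f)     # COCO results-list schema (assumed)

gt_images = {ann["image_id"] for ann in gt["annotations"]}
gt_classes = {ann["category_id"] for ann in gt["annotations"]}
det_images = {d["image_id"] for d in dets}
det_classes = {d["category_id"] for d in dets}

print("image ids only in ground truth:", gt_images - det_images)
print("image ids only in detections:  ", det_images - gt_images)
print("class ids only in ground truth:", gt_classes - det_classes)
print("class ids only in detections:  ", det_classes - gt_classes)
print("detections missing a score:    ", sum(1 for d in dets if "score" not in d))
```

If any of those sets come back non-empty (for example, category ids starting at 1 in one file and at 0 in the other), the evaluator has nothing to match and every metric is reported as zero.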

I would appreciate it if you could let me know what the problem might be!

Thanks,

rafaelpadilla commented 1 year ago

@kalikhademi Could you please share some of your images, detection and ground-truth files?

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.