Closed: ojasvijain closed this issue 10 months ago.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
I also think getting precision, recall, and F1 values would be good.
Hi, thank you for such a comprehensive library.
I am trying to compute metrics from my object detector's predicted boxes and the ground truth. (FYI, I have just one class.)
I used the coco evaluator to get my metrics and got the following result:

COCO metric:
AP [.5:.05:.95]: 0.062852
AP50: 0.109215
AP75: 0.057404
AP Small: nan
AP Medium: nan
AP Large: 0.063207
AR1: 0.000552
AR10: 0.008840
AR100: 0.076657
AR Small: nan
AR Medium: nan
AR Large: 0.076657
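For context, a summary in this 12-number format can also be produced with pycocotools' `COCOeval` (a minimal sketch, not necessarily the evaluator used in this repo; the ground-truth and detection JSON paths are placeholders):

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Placeholder paths: COCO-format ground-truth annotations and detections.
coco_gt = COCO("ground_truth.json")
coco_dt = coco_gt.loadRes("detections.json")

# "bbox" evaluates box detections; evaluate/accumulate/summarize prints the
# AP/AR summary (AP[.5:.05:.95], AP50, AP75, AR1, AR10, AR100, size breakdowns).
coco_eval = COCOeval(coco_gt, coco_dt, iouType="bbox")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()
```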
I want to get the precision, recall, and F1 score specifically. On inspecting the /src/evaluators/coco_evaluator.py file, I found that the coco_metrics variable has the metrics I need. However, when I print its contents I get: total positives: 724, TP: 60, FP: 40.

Thanks!