MathGaron / mean_average_precision

Small and simple python/numpy utility to compute mean average precision (mAP) on detection tasks.
MIT License

some bugs in ap_accumulator #9

Closed NPetsky closed 6 years ago

NPetsky commented 6 years ago

Hello, I have found two bugs (the first one took me quite a long time to track down :) ).

First: in both `precision` and `recall`, the return statement uses `/`, which is integer division in Python 2 (Python 3 has changed this to true division), so the operands have to be cast to `float` to get a meaningful result.

Second: in `recall`, `total_gt` can be zero if a class doesn't exist in an image, so the division by zero in the return has to be prevented.

Here is my corrected script, I hope it helps :)

```python
"""Simple accumulator class that keeps track of true positives, false
positives and false negatives to compute precision and recall of a
certain class."""


class APAccumulator:
    def __init__(self):
        self.TP, self.FP, self.FN = 0, 0, 0

    def inc_good_prediction(self, value=1):
        self.TP += value

    def inc_bad_prediction(self, value=1):
        self.FP += value

    def inc_not_predicted(self, value=1):
        self.FN += value

    @property
    def precision(self):
        total_predicted = self.TP + self.FP
        if total_predicted == 0:
            total_gt = self.TP + self.FN
            if total_gt == 0:
                return 1
            else:
                return 0
        return float(self.TP) / total_predicted

    @property
    def recall(self):
        total_gt = self.TP + self.FN
        if total_gt == 0:
            return 0
        return float(self.TP) / total_gt

    def __str__(self):
        out = ""
        out += "True positives : {}\n".format(self.TP)
        out += "False positives : {}\n".format(self.FP)
        out += "False Negatives : {}\n".format(self.FN)
        out += "Precision : {}\n".format(self.precision)
        out += "Recall : {}\n".format(self.recall)
        return out
```
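For reference, here is a minimal standalone illustration of the Python 2 division pitfall the first fix addresses (the variable names are just for the example, not part of the repo):

```python
# In Python 2, "/" applied to two ints performs floor division, so
# 3 / 4 evaluates to 0; Python 3 returns 0.75 (true division).
# Casting one operand to float gives the intended result on both versions:
TP, FP = 3, 1
precision = float(TP) / (TP + FP)
print(precision)  # 0.75 on both Python 2 and Python 3
```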
MathGaron commented 6 years ago

Great, thanks for the bug fixes! I will update/test this tonight.

I actually never debugged the code with Python 2, and I realize I did not mention that anywhere... I take it you are using it with Python 2?

MathGaron commented 6 years ago

I just pushed the change, however for the recall condition, I think it makes more sense to return 1 if there is no ground truth. Recall is the % of ground truth retrieved, so if there is 0 ground truth, it makes more sense to return 100% recall. Also x/0 => infinity. That said, this edge case seems hardly defined, and I can't find any paper/information about it... If anyone has a better idea/reference it would be great to mention it here!