facebookresearch / odin

A simple and effective method for detecting out-of-distribution images in neural networks.

AUPR calculation #6

Closed · podgorskiy closed this issue 6 years ago

podgorskiy commented 6 years ago

In https://github.com/facebookresearch/odin/blob/master/code/calMetric.py#L180, precision and recall are computed as:

    tp = np.sum(np.sum(X1 >= delta)) / np.float(len(X1))
    fp = np.sum(np.sum(Y1 >= delta)) / np.float(len(Y1))
    if tp + fp == 0: 
        continue
    precision = tp / (tp + fp)
    recall = tp

But the values tp and fp are actually TPR and FPR, so precision is computed as TPR/(TPR + FPR), which is not the same as TP/(TP + FP) in general. The two do coincide when the number of negative samples equals the number of positive samples, which holds for all reported datasets except iSUN, am I right?
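
For concreteness, here is a small standalone sketch (not from calMetric.py; the score distributions, variable names, and the delta threshold are made up for illustration) comparing the two ways of computing precision at one threshold. With equally sized positive and negative sets the two agree:

    import numpy as np

    # Standalone sketch (not part of the repo): synthetic scores just to
    # compare the two ways of computing precision at a single threshold.
    rng = np.random.default_rng(0)
    pos_scores = rng.normal(1.0, 1.0, size=1000)   # stand-in for X1 (in-distribution)
    neg_scores = rng.normal(0.0, 1.0, size=1000)   # stand-in for Y1 (out-of-distribution)
    delta = 0.5

    tp = np.sum(pos_scores >= delta)               # raw count of true positives
    fp = np.sum(neg_scores >= delta)               # raw count of false positives
    tpr = tp / len(pos_scores)
    fpr = fp / len(neg_scores)

    print(tpr / (tpr + fpr))   # what calMetric.py uses as "precision"
    print(tp / (tp + fp))      # standard precision; equal here only because the
                               # positive and negative sets have the same size

If you change one of the two sizes, the two printed values diverge.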

YixuanLi commented 6 years ago

You are right, the values of TPR/(TPR+FPR) and TP/(TP+FP) happen to coincide when the number of negative samples equals the number of positive samples. This is the case for all datasets we considered. For iSUN, we downsampled the positive examples to 8925, which is the same as the negative count.
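
For reference, the coincidence follows directly from the definitions, writing P and N for the number of positive and negative samples:

$$\frac{\mathrm{TPR}}{\mathrm{TPR}+\mathrm{FPR}} = \frac{TP/P}{TP/P + FP/N} = \frac{TP}{TP+FP} \quad \text{when } P = N.$$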

YixuanLi commented 6 years ago

Thanks for catching this!