open-mmlab / mmdetection

OpenMMLab Detection Toolbox and Benchmark
https://mmdetection.readthedocs.io
Apache License 2.0

How does MMDetection calculate Precision in a one-class classification? #11888

Open CarolinRue opened 1 month ago

CarolinRue commented 1 month ago

Hey,

I have a conceptual question. I trained a model with MMDetection for a one-class detection task.

With only one class, I only know the positive samples. So I can only count true positives and false negatives, and therefore only compute recall. Without a negative class I don't know the false positives (predicting positive when the actual value is negative) or the true negatives (predicting negative when the actual value is negative), so I can't calculate precision. Am I wrong?
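(Editorial note: in object detection, unlike whole-image classification, a false positive does not require negative samples. Any predicted box that fails to match a ground-truth box at the chosen IoU threshold counts as FP, so precision is well-defined even with a single class. A minimal sketch of such greedy IoU matching; this is a simplified illustration, not MMDetection's actual matching code:)

```python
def iou(a, b):
    # a, b: boxes as [x1, y1, x2, y2]
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def match_detections(preds, gts, iou_thr=0.5):
    """Greedy matching: a prediction is TP if it overlaps an unmatched
    ground-truth box with IoU >= iou_thr, otherwise FP.
    Ground-truth boxes left unmatched are FN."""
    matched = set()
    tp = fp = 0
    for p in preds:  # assume preds are sorted by score, highest first
        best, best_iou = None, iou_thr
        for i, g in enumerate(gts):
            if i in matched:
                continue
            v = iou(p, g)
            if v >= best_iou:
                best, best_iou = i, v
        if best is not None:
            matched.add(best)
            tp += 1
        else:
            fp += 1  # a false positive, with no negative images anywhere
    fn = len(gts) - len(matched)
    return tp, fp, fn

# One positive class, no negative samples (toy boxes for illustration):
gts = [[0, 0, 10, 10], [20, 20, 30, 30]]
preds = [[1, 1, 10, 10], [50, 50, 60, 60]]  # one good match, one stray box
tp, fp, fn = match_detections(preds, gts)
precision = tp / (tp + fp)  # 1 / 2 = 0.5
recall = tp / (tp + fn)     # 1 / 2 = 0.5
```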

But MMDetection reports precision during training. How can precision be calculated if false positives are undefined?


When I print the confusion matrix after testing, FalsePositive is always 100% and TrueNegative 0%. How can FalsePositive be 100% if I have no negative samples? Does the background always count as the negative class?

Maybe I'm missing something, but I don't get it at the moment.

slantingsun commented 1 month ago

Maybe you can use:

```python
import numpy as np

# Rows = ground truth, columns = prediction; the last class is background
TP = np.diag(confusion_matrix)
FP = np.sum(confusion_matrix, axis=0) - TP  # column sum minus diagonal
FN = np.sum(confusion_matrix, axis=1) - TP  # row sum minus diagonal

precision = TP / (TP + FP)
recall = TP / (TP + FN)
# [:-1] drops the background entry before averaging
average_precision = np.mean(precision[:-1])
average_recall = np.mean(recall[:-1])
f1 = 2 * (average_precision * average_recall) / (average_precision + average_recall)
```
I can't guarantee this is accurate, though.
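(Editorial note: to make the background's role concrete, here is the snippet above applied to a hypothetical 2×2 matrix for one foreground class plus background, rows = ground truth, columns = prediction. The numbers are made up for illustration; note the background/background cell is 0 because detection has no true negatives:)

```python
import numpy as np

# 8 boxes detected correctly, 2 GT boxes missed (background column),
# 3 stray detections on background (background row), 0 true negatives.
confusion_matrix = np.array([
    [8, 2],   # GT: class 0
    [3, 0],   # GT: background
])

TP = np.diag(confusion_matrix)              # [8, 0]
FP = np.sum(confusion_matrix, axis=0) - TP  # [3, 2]
FN = np.sum(confusion_matrix, axis=1) - TP  # [2, 3]

precision = TP / (TP + FP)  # class 0: 8 / 11
recall = TP / (TP + FN)     # class 0: 8 / 10
# [:-1] excludes the background class from the averages
average_precision = np.mean(precision[:-1])  # 8/11 ~ 0.727
average_recall = np.mean(recall[:-1])        # 0.8
```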