VinaTsai opened this issue 3 years ago
The precision-recall curve shows the tradeoff between precision and recall at different decision thresholds. A high area under the curve represents both high recall and high precision, where high precision relates to few false positives and high recall relates to few false negatives. High scores for both show that the classifier is returning accurate results (high precision) as well as returning a majority of all positive instances (high recall).
- Precision-Recall is a useful measure of success of prediction when the classes are very imbalanced.
A system with high recall but low precision returns many results, but most of its predicted labels are incorrect when compared to the training labels. A system with high precision but low recall is just the opposite, returning very few results, but most of its predicted labels are correct when compared to the training labels. An ideal system with high precision and high recall will return many results, with all results labeled correctly.
https://scikit-learn.org/stable/auto_examples/model_selection/plot_precision_recall.html
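The curve described above can be computed with scikit-learn's `precision_recall_curve` and summarized with `average_precision_score`. A minimal sketch, using a hypothetical imbalanced dataset from `make_classification` (the data and the logistic-regression model are illustrative choices, not from the original post):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, precision_recall_curve
from sklearn.model_selection import train_test_split

# Hypothetical imbalanced binary dataset: ~10% positive class.
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]  # probability of the positive class

# One (precision, recall) point per decision threshold.
precision, recall, thresholds = precision_recall_curve(y_test, scores)

# Average precision summarizes the area under the curve.
ap = average_precision_score(y_test, scores)
print(f"Average precision: {ap:.3f}")
```

Sweeping the threshold trades precision against recall: a higher threshold predicts positive less often (fewer false positives, higher precision) at the cost of missing true positives (lower recall).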
TP = true positives, TN = true negatives, FP = false positives, FN = false negatives.
Accuracy, Precision, Recall and F1 score:
Accuracy = (TP+TN)/(TP+FP+FN+TN)
Precision = TP/(TP+FP)
Recall = TP/(TP+FN)
F1 Score = 2 * (Recall * Precision) / (Recall + Precision)
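The formulas above can be computed directly from confusion-matrix counts. A minimal sketch; the TP/TN/FP/FN values are illustrative numbers, not from the original post:

```python
# Hypothetical confusion-matrix counts for an imbalanced problem.
TP, TN, FP, FN = 80, 900, 20, 40

accuracy = (TP + TN) / (TP + FP + FN + TN)
precision = TP / (TP + FP)          # accuracy among predicted positives
recall = TP / (TP + FN)             # coverage of actual positives
f1 = 2 * (recall * precision) / (recall + precision)  # harmonic mean

print(f"Accuracy:  {accuracy:.3f}")
print(f"Precision: {precision:.3f}")
print(f"Recall:    {recall:.3f}")
print(f"F1 score:  {f1:.3f}")
```

Note how accuracy stays high here (the many true negatives dominate) even though recall is mediocre, which is why precision, recall, and F1 are preferred over accuracy for imbalanced classes.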
Conclusion:
References:
https://blog.exsilio.com/all/accuracy-precision-recall-f1-score-interpretation-of-performance-measures/