Closed 0nyoun9 closed 6 months ago
When you already have pred_labels (0 or 1), you should evaluate other metrics, such as precision, instead of AUC. AUC can be seen as an average performance metric over different thresholds (these thresholds determine the 0/1 labels).
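For reference, here is a minimal sketch (using scikit-learn with toy arrays, not the repo's actual data) of how the two kinds of inputs map to the two kinds of metrics:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, precision_score, recall_score, f1_score

# Toy stand-ins: gt_labels are binary ground truth, total_topk_err_scores are
# continuous anomaly scores, pred_labels come from one fixed threshold.
gt_labels = np.array([0, 0, 1, 1, 0, 1])
total_topk_err_scores = np.array([0.1, 0.4, 0.8, 0.9, 0.3, 0.35])
pred_labels = (total_topk_err_scores > 0.5).astype(int)

# AUC needs the continuous scores: it effectively sweeps over all thresholds.
auc_score = roc_auc_score(gt_labels, total_topk_err_scores)

# With hard 0/1 predictions, report threshold-dependent metrics instead.
precision = precision_score(gt_labels, pred_labels)
recall = recall_score(gt_labels, pred_labels)
f1 = f1_score(gt_labels, pred_labels)

print(f"AUC={auc_score:.3f}  P={precision:.3f}  R={recall:.3f}  F1={f1:.3f}")
```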
I get it, thank you!
Hi, thanks for your excellent work! I saw in your code that you calculate the AUC score using `gt_labels` and `total_topk_err_scores`:

`auc_score = roc_auc_score(gt_labels, total_topk_err_scores)`

Can I use `gt_labels` and `pred_labels` to calculate the AUC score?