Closed shehrozskhan closed 4 years ago
Hi @titubeta, the `predict` function will actually return the real-valued scores (unlike sklearn, where `predict` returns labels). To get binary labels you would need to threshold the predictions (`predictions >= 0` are positive and `predictions < 0` are negative).
Thanks. So, if I understand correctly, these scores can be used directly to compute AUC?
That's correct; you can use these scores directly to compute AUC. If you want (uncalibrated) confidence values between 0 and 1, you can pass them through a sigmoid, but that isn't necessary for AUC.
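To make the point concrete, here is a minimal sketch showing why raw scores suffice: AUC is the probability that a randomly chosen positive outscores a randomly chosen negative (the Mann-Whitney statistic), and a sigmoid is monotonic, so it never changes the ranking. The scores and labels below are made up for illustration:

```python
import math

# Hypothetical scores and ground-truth bag labels -- illustrative only.
scores = [2.0, 0.4, -0.1, -1.5]
y_true = [1, 1, -1, -1]

# AUC via the Mann-Whitney statistic: fraction of (positive, negative)
# pairs where the positive receives the higher score (ties count half).
pos = [s for s, y in zip(scores, y_true) if y == 1]
neg = [s for s, y in zip(scores, y_true) if y == -1]
pairs = [(p, n) for p in pos for n in neg]
auc = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p, n in pairs) / len(pairs)
print(auc)  # 1.0

# Optional: squash scores into (0, 1) with a sigmoid. This is monotonic,
# so it leaves AUC unchanged; the values are uncalibrated confidences.
probs = [1.0 / (1.0 + math.exp(-s)) for s in scores]
```

In practice you would simply pass the raw scores as the second argument of `sklearn.metrics.roc_auc_score`.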
Hello, is there a way to obtain the scores of the predictions instead of labels? As you may guess, I am interested in computing AUC rather than accuracy or a similar metric. When I create the classifier object with MISVM, I can see a `score(X, y)` function. However, when I pass the test bags as X and the test bag labels as y, it gives me an error.