Xtra-Computing / thundersvm

ThunderSVM: A Fast SVM Library on GPUs and CPUs
Apache License 2.0

Cross-validation with different criteria than the accuracy metric #150

Open aaroncaffrey opened 5 years ago

aaroncaffrey commented 5 years ago

Feature suggestion:

Currently, as far as I can tell, cross-validation is performed using only the accuracy metric. However, accuracy can be very misleading for imbalanced classes, where the number of training examples per class differs. For example, with 90 positive and 10 negative training examples, 90% accuracy can be achieved simply by always predicting the positive class.
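A minimal sketch of the problem, using scikit-learn's metrics on the hypothetical 90/10 split above (the data here is made up for illustration): a predictor that always outputs the positive class reaches 90% accuracy while never detecting the negative class.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, matthews_corrcoef

y_true = np.array([1] * 90 + [0] * 10)   # 90 positive, 10 negative examples
y_pred = np.ones_like(y_true)            # always predict the positive class

print(accuracy_score(y_true, y_pred))         # 0.9  -- looks good, but is misleading
print(f1_score(y_true, y_pred, pos_label=0))  # 0.0  -- minority class never predicted
print(matthews_corrcoef(y_true, y_pred))      # 0.0  -- no better than chance
```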

Therefore, it would be desirable to be able to cross-validate against other metrics, such as F1, recall, precision, MCC, average precision (equivalent to the area under the precision-recall curve), and AUC ROC.

Would it be possible to add a command-line parameter to the training program in a future release for selecting such an alternative cross-validation performance metric?

Libsvm has a similar optional module here: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/eval/index.html
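In the meantime, a possible workaround is to run cross-validation outside the command-line trainer. This is only a sketch, assuming ThunderSVM's scikit-learn-style Python wrapper (`thundersvm.SVC` with `fit`/`predict`) is available; the helper function and its parameters below are hypothetical, not part of the library.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import f1_score, matthews_corrcoef
from thundersvm import SVC  # assumes the ThunderSVM Python bindings are installed

def cross_validate_metric(X, y, metric, n_splits=5, **svm_params):
    """Hypothetical helper: per-fold scores of `metric` for a ThunderSVM SVC."""
    scores = []
    folds = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    for train_idx, test_idx in folds.split(X, y):
        clf = SVC(**svm_params)                  # fresh model per fold
        clf.fit(X[train_idx], y[train_idx])      # train on the fold's training split
        y_pred = clf.predict(X[test_idx])        # predict on the held-out split
        scores.append(metric(y[test_idx], y_pred))
    return np.array(scores)

# Example usage on some feature matrix X and label vector y:
# f1_scores  = cross_validate_metric(X, y, f1_score, C=10, gamma=0.5)
# mcc_scores = cross_validate_metric(X, y, matthews_corrcoef, C=10, gamma=0.5)
```

This avoids the built-in accuracy-only cross-validation entirely, at the cost of retraining the model once per fold from Python rather than from the command line.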

Thank you.

zeyiwen commented 5 years ago

Thanks! We will improve it. Please stay tuned.