all-contributors / ac-learn

ML platform for all contributors
MIT License

Confusion Matrix correction #23

Closed Berkmann18 closed 5 years ago

Berkmann18 commented 5 years ago

At the moment, the Precision/Recall/Accuracy/F1-score calculations seem bizarrely wrong, and I managed to reproduce them in a spreadsheet using what I understood to be the correct way of computing confusion-matrix statistics in the multi-class setting. Limdu's calculation appears to be correct, but it accounts for all predicted labels, whereas I only look at the first one in each array of predicted labels (assuming it's the most likely to be correct).

I don't know exactly what I'm missing, or whether sticking to a proper multi-label approach (as opposed to just a multi-class one) would be better (even though the learner will only care about one label per prediction).
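For context, the approach described above (per-class statistics computed from only the first predicted label) can be sketched roughly as follows. This is a hypothetical illustration, not ac-learn's or Limdu's actual code; the function name and argument shapes are assumptions.

```javascript
// Hypothetical sketch: per-class precision/recall/F1 in the multi-class
// setting, scoring only the FIRST label of each predicted-label array.
function multiClassStats(actual, predicted, labels) {
  const stats = {};
  for (const label of labels) {
    let tp = 0, fp = 0, fn = 0;
    for (let i = 0; i < actual.length; i++) {
      // Only the first predicted label is considered, assuming it's the
      // most likely to be correct (the rest of the array is ignored).
      const pred = predicted[i][0];
      if (pred === label && actual[i] === label) tp++;
      else if (pred === label && actual[i] !== label) fp++;
      else if (pred !== label && actual[i] === label) fn++;
    }
    // Guard against division by zero for classes that never appear.
    const precision = tp + fp === 0 ? 0 : tp / (tp + fp);
    const recall = tp + fn === 0 ? 0 : tp / (tp + fn);
    const f1 =
      precision + recall === 0
        ? 0
        : (2 * precision * recall) / (precision + recall);
    stats[label] = { precision, recall, f1 };
  }
  return stats;
}
```

A full multi-label evaluation (like Limdu's) would instead compare the whole predicted-label array against the whole set of true labels, which is why the two sets of numbers diverge.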

Berkmann18 commented 5 years ago

Closing, as the discrepancy just seems to be due to the bias towards the null label and the lack of balance with the other labels.