LouisAUTHIE opened 2 months ago
Good observation. The Accuracy metric scores predictions differently than the accuracy value computed from the confusion matrix in the Multiclass Breakdown report. Specifically, the metric only counts true positives, while the report counts both true positives and true negatives. I tried to show the difference by providing the formulas in the docs, but they could be wrong - I don't consider myself a mathematician. Perhaps someone with more experience can take a look.
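To make the difference concrete, here is a small Python sketch (illustrative data only, not Rubix ML code) comparing the two formulas: the metric-style accuracy, which is the fraction of exact matches, versus the report-style accuracy, which computes (TP + TN) / n per class one-vs-rest and averages over classes:

```python
labels = ['a', 'a', 'b', 'b', 'c', 'c']
preds  = ['a', 'b', 'b', 'c', 'c', 'c']
n = len(labels)

# Metric-style accuracy: fraction of predictions that exactly match the label.
overall = sum(p == l for p, l in zip(preds, labels)) / n

# Report-style accuracy: for each class, treat it as a one-vs-rest binary
# problem, compute (TP + TN) / n, then average across classes.
per_class = []
for c in sorted(set(labels)):
    tp = sum(p == c and l == c for p, l in zip(preds, labels))
    fp = sum(p == c and l != c for p, l in zip(preds, labels))
    fn = sum(p != c and l == c for p, l in zip(preds, labels))
    tn = n - tp - fp - fn
    per_class.append((tp + tn) / n)

avg = sum(per_class) / len(per_class)

print(overall)  # 0.666... (4 of 6 predictions correct)
print(avg)      # 0.777... (true negatives inflate the per-class scores)
```

The two numbers disagree on the same data because every true negative for one class is ignored by the exact-match formula but counted once per class by the per-class formula.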
In the Accuracy documentation, the formula presented is not the one implemented in the class: https://github.com/RubixML/ML/blob/master/src/CrossValidation/Metrics/Accuracy.php