ctwardy closed this issue 7 years ago
Wait, no, fixing #15 didn't fix this. The problem is that the classifier never forecasts 'error'.

The scikit-learn scores seem to ignore ERROR (i.e. 'error') in the results. Notice the zeros in the 'error' row of the classification report, while the confusion matrix shows some activity:

```
         precision    recall    f1-score    support
error         0.00      0.00        0.00        240
```

Confusion Matrix:

```
error: 7, 7, 8, 68, 1, 16, 133, 0   <-- Note the zero in the last column.
```

The 'error' entries are showing up as UNCERTAIN or as 'forum', so 'error' is never rising above the threshold. Investigate.
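A minimal sketch of the suspected failure mode, with hypothetical names and a made-up threshold (the actual classifier and cutoff are not shown in this issue): if the model's score for 'error' never clears the decision threshold, predictions fall back to UNCERTAIN or to the runner-up class, so 'error' ends up with 0.00 precision/recall/f1 even though 'error' rows appear in the confusion matrix.

```python
THRESHOLD = 0.5  # assumed cutoff; the issue suggests 'error' never clears it

def predict_with_threshold(probs, labels, threshold=THRESHOLD):
    """Return the top label if its score clears the threshold,
    otherwise fall back to the label 'UNCERTAIN'."""
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best] if probs[best] >= threshold else "UNCERTAIN"

labels = ["error", "forum"]
# Hypothetical scores for three documents whose true class is shown in comments.
samples = [
    [0.45, 0.40],  # true 'error': top score is 'error' but below threshold
    [0.30, 0.65],  # true 'error': 'forum' wins outright
    [0.10, 0.90],  # true 'forum': correctly classified
]
preds = [predict_with_threshold(p, labels) for p in samples]
print(preds)  # ['UNCERTAIN', 'forum', 'forum'] -- 'error' is never predicted
```

With predictions like these, 'error' never appears in the predicted column, which matches both the all-zero 'error' row in the classification report and the zero in the last column of the confusion matrix.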