Closed by dchaley 7 months ago
Example output:
```
              precision    recall  f1-score   support

       False       0.81      0.83      0.82     31733
        True       0.98      0.97      0.97    230411

    accuracy                           0.95    262144
   macro avg       0.89      0.90      0.90    262144
weighted avg       0.96      0.95      0.96    262144
```
To test correctness, we can use an sklearn classification report. It covers the same ground as a confusion matrix but adds more detail: per-class precision, recall, f1-scores, and support counts.
Write a notebook that demonstrates the process: read our selected training data, re-predict on the corresponding inputs, and generate a classification report.
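A minimal sketch of the report step the notebook would run. The tiny in-memory arrays stand in for the real ground-truth masks and re-predicted outputs (loading and prediction are project-specific and assumed here); only the `classification_report` call is the actual sklearn API.

```python
import numpy as np
from sklearn.metrics import classification_report

# Stand-ins for the real data: ground-truth boolean mask and the
# re-predicted mask. In the notebook these would come from the
# selected training data and the model's predictions.
y_true = np.array([[True, True, False], [True, False, True]])
y_pred = np.array([[True, False, False], [True, False, True]])

# Flatten 2D masks to 1D pixel vectors (e.g. a 512x512 mask
# contributes 262144 entries to "support").
report = classification_report(y_true.ravel(), y_pred.ravel())
print(report)
```

Boolean labels show up as `False`/`True` rows in the report, matching the example output above; `support` is the pixel count per class.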