This PR introduces the new `eval_metrics.py` script, which includes the `report_model_metrics` function. This function generates a DataFrame of model performance metrics from the provided input data, models, and predictions. It supports multiple modes of operation (see the sketch after this list), including:
- Evaluating metrics for outcome columns and predicted probabilities from a DataFrame.
- Calculating metrics for specified models using validation data.
- Computing metrics for a DataFrame of predicted probabilities.
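As a rough illustration of how such a multi-mode dispatch can work, here is a minimal sketch that scores only AUC ROC for brevity. The function name, signature, and argument names below are hypothetical, not the actual API in `eval_metrics.py`:

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

def report_metrics_sketch(y_true, models=None, X=None, pred_prob_cols=None):
    """One row per model or prediction column; AUC ROC only, for brevity."""
    rows = {}
    for name, model in (models or {}).items():
        # Mode: a fitted model, scored against validation features X
        rows[name] = {"AUC ROC": roc_auc_score(y_true, model.predict_proba(X)[:, 1])}
    for name, probs in (pred_prob_cols or {}).items():
        # Mode: a ready-made predicted-probability column
        rows[name] = {"AUC ROC": roc_auc_score(y_true, probs)}
    return pd.DataFrame(rows).T
```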
Key Features:
Metrics calculated include (see the scikit-learn sketch after this list):
- Precision/PPV
- Average Precision
- Sensitivity (Recall)
- Specificity
- AUC ROC
- Brier Score
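Each of these can be computed with standard scikit-learn calls. Below is a minimal sketch of one row of the metrics table, assuming binary outcomes and a 0.5 classification threshold; the helper name and the threshold are illustrative, not necessarily what the script uses:

```python
import numpy as np
from sklearn.metrics import (
    average_precision_score,
    brier_score_loss,
    confusion_matrix,
    precision_score,
    recall_score,
    roc_auc_score,
)

def metric_row(y_true, y_prob, threshold=0.5):
    """Compute one row of the metrics table for a single probability vector."""
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "Precision/PPV": precision_score(y_true, y_pred, zero_division=0),
        "Average Precision": average_precision_score(y_true, y_prob),
        "Sensitivity": recall_score(y_true, y_pred),  # a.k.a. recall
        "Specificity": tn / (tn + fp) if (tn + fp) else float("nan"),
        "AUC ROC": roc_auc_score(y_true, y_prob),
        "Brier Score": brier_score_loss(y_true, y_prob),
    }
```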
- Supports both fitted models and individual prediction columns for flexible evaluation.
- Adds functionality to compute metrics across a variety of input configurations, including predicted-probability columns and actual models (a toy end-to-end example follows this list).
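A toy end-to-end example, reusing the hypothetical `metric_row` helper from the sketch above, shows a fitted model and a precomputed probability column evaluated side by side. All names and data here are illustrative:

```python
import numpy as np
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)
external_probs = np.random.default_rng(0).uniform(size=len(y))  # stand-in for an external score column

rows = {
    "logreg": metric_row(y, model.predict_proba(X)[:, 1]),
    "external_score": metric_row(y, external_probs),
}
print(pd.DataFrame(rows).T)  # one row per model / prediction column
```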
New File:
- `eval_metrics.py`: Contains the `report_model_metrics` function.