shankarpandala / lazypredict

Lazy Predict helps build a lot of basic models without much code and helps understand which models work better without any parameter tuning
MIT License
3.03k stars · 344 forks

ROC-AUC calculation #425

Open · aybarsnazlica opened this issue 1 year ago

aybarsnazlica commented 1 year ago

According to the scikit-learn documentation, `roc_auc_score` expects probability scores for the positive class from the estimator, i.e. `estimator.predict_proba(X)[:, 1]`. However, in Supervised.py `roc_auc_score` is given binary predictions, which changes the resulting score. Is there a specific reason for this, or is it a bug?

In Supervised.py:

```python
y_pred = pipe.predict(X_test)
...
roc_auc = roc_auc_score(y_test, y_pred)
```

https://scikit-learn.org/stable/modules/generated/sklearn.metrics.roc_auc_score.html#sklearn.metrics.roc_auc_score
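
To make the difference concrete, here is a minimal, self-contained sketch (the dataset and classifier are arbitrary choices for illustration, not taken from lazypredict) showing that scoring hard predictions and scoring positive-class probabilities generally give different ROC AUC values:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem (illustrative only).
X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# What the Supervised.py snippet above does: hard 0/1 predictions.
auc_from_labels = roc_auc_score(y_test, clf.predict(X_test))

# What the scikit-learn docs describe: probability of the positive class.
auc_from_proba = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])

print(auc_from_labels, auc_from_proba)  # the two values generally differ
```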

mohdelite commented 1 year ago

[image attached] I am using version 0.2.12 of lazypredict, and ROC AUC returned None! Do you have any idea why this happens?
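
One possible explanation, sketched under the assumption that Supervised.py wraps the `roc_auc_score` call in a try/except and falls back to `None` when it raises: for a multiclass target, `roc_auc_score` rejects 1-D hard label predictions, so the metric would fail and show up as `None` in the results table. The arrays and the try/except below are illustrative assumptions, not a copy of the library's code:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical multiclass target and hard label predictions.
y_test = np.array([0, 1, 2, 1, 0, 2])
y_pred = np.array([0, 1, 2, 2, 0, 1])

try:
    roc_auc = roc_auc_score(y_test, y_pred)
except Exception as exc:
    # Assumed fallback: the metric is reported as None when it cannot be computed.
    roc_auc = None
    print(type(exc).__name__, exc)

print(roc_auc)  # None
```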