Early Stopping K-Fold Issue

Currently, our implementation of boosting algorithms (e.g., CatBoost, XGBoost) for k-fold cross-validation does not include early stopping. Early stopping halts training once performance on a validation set stops improving, which prevents overfitting, reduces computation time, and improves model generalization.
Planned Implementation
We plan to implement early stopping for KFold in an upcoming pre-release. The following modifications are required:
clf.fit(X, y)
This call will be updated to accept custom fit_params, such as eval_set and verbose, which are needed for early stopping; a sketch follows below.
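As a rough sketch of what that could look like (the data, the choice of XGBoost, and the fit_params shape here are illustrative assumptions, not the final API), the fit call would forward caller-supplied keyword arguments instead of being fixed:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    # Toy data purely for illustration.
    X, y = np.random.rand(200, 5), np.random.randint(0, 2, 200)
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # early_stopping_rounds is a constructor argument in xgboost >= 1.6;
    # older versions accepted it in fit() instead.
    clf = XGBClassifier(n_estimators=1000, early_stopping_rounds=50)

    # Previously the call was a fixed clf.fit(X, y). Forwarding custom
    # fit_params lets the caller supply the eval_set and verbosity that
    # early stopping needs:
    fit_params = {"eval_set": [(X_val, y_val)], "verbose": False}
    clf.fit(X_train, y_train, **fit_params)

With this shape, the cross-validation wrapper can pass **fit_params straight through to fit() without hard-coding estimator-specific arguments.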
Other Necessary Changes
Similar modifications will be required in other parts of the codebase so that early stopping is applied consistently across all models and folds.
This will enable early stopping while preserving fairness across the k folds: the validation set used for stopping should be carved out of each training fold rather than taken from the held-out test fold, since stopping on the same data we score against would bias the evaluation. A sketch of this per-fold handling follows below.
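A minimal per-fold sketch, assuming XGBoost and a 5-fold split (all names and parameters here are illustrative, not the planned implementation): each training fold donates its own validation split for early stopping, so the held-out test fold never influences when training stops.

    import numpy as np
    from sklearn.model_selection import KFold, train_test_split
    from xgboost import XGBClassifier

    # Toy data purely for illustration.
    X, y = np.random.rand(300, 5), np.random.randint(0, 2, 300)
    scores = []

    for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        # Carve a validation split out of the training fold so that early
        # stopping never sees the held-out test fold.
        X_tr, X_val, y_tr, y_val = train_test_split(
            X[train_idx], y[train_idx], test_size=0.2, random_state=0
        )
        clf = XGBClassifier(n_estimators=1000, early_stopping_rounds=50)
        clf.fit(X_tr, y_tr, eval_set=[(X_val, y_val)], verbose=False)
        # Score only on the untouched test fold.
        scores.append(clf.score(X[test_idx], y[test_idx]))

    print(f"mean CV accuracy over 5 folds: {np.mean(scores):.3f}")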