Closed: rohan-gt closed this issue 4 years ago
Yeah, that's a good point. I think we'll probably want to special-case the training call for xgboost and lightgbm, as they have a different way of doing early stopping.
@richardliaw I've updated the issue with more details. Btw, how does `max_iters`
perform early stopping? I'm a little confused about how it works in conjunction with `n_iter`.
Yeah... I guess that's the penalty we have to pay for adhering to the sklearn API.
`max_iters` = number of "epochs" given to each hyperparameter config; `n_iter` = number of hyperparameter evals.
Does that make sense?
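For intuition, the interplay between the two knobs can be sketched as a plain loop. Everything below (`tune_search`, `fake_partial_fit_score`, the `patience` rule) is a hypothetical illustration, not the actual library API: `n_iter` controls the outer loop over sampled configs, `max_iters` bounds the inner epoch loop, and early stopping can cut a trial short before `max_iters` is reached.

```python
import random

def fake_partial_fit_score(config, epoch):
    # Stand-in for one partial_fit + validation step: improves for a
    # few epochs, then plateaus (hypothetical, for illustration only).
    return min(epoch, 5) * config["lr"]

def tune_search(param_space, n_iter, max_iters, patience=3):
    """Hypothetical sketch: n_iter = how many hyperparameter configs are
    sampled; max_iters = the per-config epoch budget, which early
    stopping may cut short."""
    results = []
    for _ in range(n_iter):                     # n_iter hyperparameter evals
        config = {k: random.choice(v) for k, v in param_space.items()}
        best, stale, epochs_run = float("-inf"), 0, 0
        for epoch in range(max_iters):          # at most max_iters "epochs"
            score = fake_partial_fit_score(config, epoch)
            epochs_run += 1
            if score > best:
                best, stale = score, 0
            else:
                stale += 1
                if stale >= patience:           # early stopping cuts the trial short
                    break
        results.append((config, best, epochs_run))
    return results
```

So with `n_iter=4, max_iters=20`, the sketch runs 4 trials, and each trial stops as soon as the score stops improving rather than burning all 20 epochs.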
In #63, we're going to enable early stopping for XGBoost via incremental learning. We decided not to implement it for lgbm because it is not yet on a stable version.
Hmm, not sure how we're going to support CatBoost, but I will open an issue to track lightgbm.
I'm getting the following error when setting `early_stopping=True`:
These could be potential fixes: