Y-oHr-N / OptGBM

Optuna + LightGBM = OptGBM

Documentation Help? #94


kmedved commented 4 years ago

Hello - I'm a big fan of this package. It quickly matches the performance I get from much more complicated tuning approaches. However, I was wondering whether the documentation could be improved somewhat. Specifically, I'd like to know how to see the final parameters selected by the model and how early stopping is handled. Documenting any other user-exposed parameters would also be useful.

Thanks!

Y-oHr-N commented 4 years ago

Hi @kmedved, Thanks for the feedback.

> I was wondering if it would be possible to improve the documentation somewhat?

The documentation will be improved later; I'm currently trying to merge OptGBM into Optuna in optuna/optuna#1018.

How to see the best hyperparameters

The best hyperparameters found by Optuna and the best iteration selected by early stopping can be checked as follows:

```python
import optgbm as lgb
from sklearn.datasets import load_boston

# Load the data and fit; the hyperparameter search runs inside `fit`.
X, y = load_boston(return_X_y=True)
reg = lgb.LGBMRegressor(random_state=0)
reg.fit(X, y)

print(reg.best_params_)     # best hyperparameters found by Optuna
print(reg.best_iteration_)  # best iteration selected by early stopping
```

If param_distributions is set to None, OptGBM searches over a default set of parameters.
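For reference, here is a minimal sketch of supplying a custom search space, assuming param_distributions accepts a dict mapping LightGBM parameter names to optuna.distributions objects; the parameter names and ranges below are illustrative, not OptGBM's defaults:

```python
import optgbm as lgb
from optuna.distributions import IntUniformDistribution, LogUniformDistribution

# Illustrative search space; these keys and ranges are assumptions,
# not OptGBM's documented defaults.
param_distributions = {
    "num_leaves": IntUniformDistribution(2, 127),
    "reg_alpha": LogUniformDistribution(1e-8, 10.0),
}

reg = lgb.LGBMRegressor(
    param_distributions=param_distributions,  # assumed constructor argument
    random_state=0,
)
```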

Early stopping

OptGBM uses lgb.cv during optimization. This function applies early stopping to the CV score. If refit is set to True, OptGBM refits an estimator on the whole dataset using the best hyperparameters and the best iteration.
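To illustrate the idea (a minimal sketch, not OptGBM's actual internals): each trial could run plain LightGBM cross-validation with early stopping and score the trial on the resulting CV metric. Refitting on the whole dataset with that best iteration is then what refit=True automates.

```python
import lightgbm
import numpy as np

# Toy data, purely for illustration.
X = np.random.rand(100, 5)
y = np.random.rand(100)
dtrain = lightgbm.Dataset(X, label=y)

params = {"objective": "regression", "verbosity": -1}

# lgb.cv stops adding boosting rounds once the mean CV score
# stops improving for `stopping_rounds` consecutive rounds.
cv_results = lightgbm.cv(
    params,
    dtrain,
    num_boost_round=1000,
    nfold=5,
    stratified=False,  # stratified folds are for classification only
    callbacks=[lightgbm.early_stopping(stopping_rounds=10)],
)

# With early stopping, the returned metric lists are truncated at the
# best iteration, so their length is the iteration a refit would use.
metric_key = next(k for k in cv_results if k.endswith("-mean"))
best_iteration = len(cv_results[metric_key])
best_score = cv_results[metric_key][-1]
```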

kmedved commented 4 years ago

Thanks - that's very helpful. It would be nice for Optuna to have a scikit-learn-compatible version of the LightGBM Tuner, so I'm looking forward to that merge down the road.

Thanks again for this project, and the reply.