erdogant / hgboost

hgboost is a python package for hyper-parameter optimization for xgboost, catboost or lightboost using cross-validation, and evaluating the results on an independent validation set. hgboost can be applied for classification and regression tasks.
http://erdogant.github.io/hgboost

'n_estimators' value range not respected #20

Closed CarterwoodAnalytics closed 2 months ago

CarterwoodAnalytics commented 2 months ago

Description

Whenever I run my script using hgboost, the n_estimators value returned is never within the specified range; in fact, it is often no higher than 39. Below is the code snippet I am using to set the parameters.

Code Snippet

from hyperopt import hp
import numpy as np

xgb_reg_params = {
    'learning_rate': hp.quniform('learning_rate', 0.05, 0.31, 0.05),
    'max_depth': hp.choice('max_depth', np.arange(5, 30, 1, dtype=int)),
    'min_child_weight': hp.choice('min_child_weight', np.arange(1, 10, 1, dtype=int)),
    'gamma': hp.choice('gamma', [0, 0.25, 0.5, 1.0]),
    'reg_lambda': hp.choice('reg_lambda', [0.1, 1.0, 5.0, 10.0, 50.0, 100.0]),
    'subsample': hp.uniform('subsample', 0.5, 1),
    'n_estimators': hp.choice('n_estimators', range(40, 205, 5)),
    'early_stopping_rounds': 25
}

Expected Behaviour The n_estimators value should be one of the values in range(40, 205, 5), i.e. between 40 and 200 inclusive.

Actual Behaviour The n_estimators value is no higher than 39, which is below the specified range.

Environment details
hgboost version: 1.1.5
NumPy version: 1.26.4
Python version: 3.11

CarterwoodAnalytics commented 2 months ago

I may have made the mistake of taking the index as the actual value, but I'm not sure, since 'learning_rate': hp.quniform('learning_rate', 0.05, 0.31, 0.05) returns a value of 0.05.
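That suspicion matches how hyperopt reports results: fmin returns each hp.choice parameter as an index into its option list, while hp.quniform and hp.uniform return the sampled value directly. That would explain both observations, since range(40, 205, 5) has 33 candidates (indices 0-32, all below 40) while learning_rate comes back as a real value like 0.05. A minimal sketch of the mapping in plain Python (the index value below is hypothetical; with hyperopt itself, hyperopt.space_eval resolves a full space at once):

```python
# The options behind hp.choice('n_estimators', range(40, 205, 5)):
options = list(range(40, 205, 5))    # 40, 45, ..., 200 -> 33 candidates

# fmin reports an hp.choice parameter as an INDEX into the option list
# (0..32 here), not as the value itself -- consistent with never seeing
# an "n_estimators" above 39.
best_index = 7                       # hypothetical index returned by fmin

# Map the index back to the actual hyper-parameter value:
n_estimators = options[best_index]   # 75

print(len(options), n_estimators)
```

If hgboost surfaces the raw fmin result, applying hyperopt.space_eval(space, best) to it should recover the real values for every hp.choice parameter in the search space.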