hgboost is a Python package for hyperparameter optimization for xgboost, catboost, or lightboost using cross-validation, evaluating the results on an independent validation set. It can be applied to both classification and regression tasks.
I may have made the mistake of reading the returned index as the actual value (hyperopt's hp.choice reports the index of the selected option, not the option itself), but I'm not sure, since 'learning_rate': hp.quniform('learning_rate', 0.05, 0.31, 0.05) does return an actual value of 0.05.
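If n_estimators is defined with hp.choice over range(40, 205, 5) (an assumption based on the ranges reported below), an index-instead-of-value bug would explain the ceiling: there are only 33 candidates, so the largest index the optimizer can report is 32. A quick pure-Python sketch:

```python
# Assumed candidate list for n_estimators: 40 to 200 in steps of 5.
candidates = list(range(40, 205, 5))

print(len(candidates))       # 33 candidates in total
print(max(candidates))       # 200: the kind of value one would expect to see
print(len(candidates) - 1)   # 32: the largest index hp.choice could report
```

Every possible index (0 through 32) is below 40, which matches "never higher than 39".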
Description
Whenever I run my script using hgboost, the n_estimators value returned is never within the specified range; in fact, it is often no higher than 39. Below is the code snippet I am using to set the parameters.
Code Snippet
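The original snippet was not captured here; a representative search space consistent with the ranges described (the exact dictionary is an assumption, except for the learning_rate line quoted above) would look like:

```python
from hyperopt import hp

# Assumed search-space definition; only the learning_rate entry is quoted
# verbatim from the report, the rest reconstructs the ranges described.
space = {
    'learning_rate': hp.quniform('learning_rate', 0.05, 0.31, 0.05),
    # hp.choice samples from an explicit list of candidates; the optimizer
    # reports the *index* of the winning candidate, not the candidate itself.
    'n_estimators': hp.choice('n_estimators', range(40, 205, 5)),
}
```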
Expected Behaviour
The n_estimators value should be within the range of 40 to 205 as specified.

Actual Behaviour
The n_estimators value is no higher than 39, which is outside the specified range.

Environment details
hgboost version: 1.1.5
numpy version: 1.26.4
Python version: 3.11
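As a stopgap, if the reported number is indeed an hp.choice index (an assumption), the actual value can be recovered by indexing into the candidate list; hyperopt also provides space_eval for mapping a dict of hp.choice indices back to values. A minimal pure-Python sketch of the same mapping:

```python
# Assumed search space for n_estimators, as described above.
candidates = list(range(40, 205, 5))

reported = 12                 # hypothetical index reported by the optimizer
actual = candidates[reported]
print(actual)                 # 100
```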