Closed: Mikki99 closed this issue 2 years ago.
You are right. The parameters were taken over from Hyperopt, and some of them are index values rather than the actual parameter values, which is not what is expected. I now store the parameters from the actual models instead of using the Hyperopt summary. This does not change the results of the learned model, because the parameters are only stored for the output. Thanks for the issue!
Update to the latest version (>=1.1.0)
pip install -U hgboost
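A quick way to confirm the behavior after updating is to compare the stored parameters against the fitted model itself. This is a minimal sketch, assuming a results dictionary from an hgboost run with the documented 'best_params' and 'model' entries, and an XGBoost model that exposes the scikit-learn style get_params():

```python
# Sketch: after the update, the stored best_params should agree with the
# parameters of the actual fitted model for every tuned key.
model_params = results['model'].get_params()
for name, value in results['best_params'].items():
    print(f"{name}: best_params={value!r}, model={model_params.get(name)!r}")
```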
I used hgboost for optimizing the hyper-parameters of my XGBoost model as described in the API References, with the following parameters:
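Roughly the following; the values here are placeholders rather than the exact settings, and the call shape assumes the documented hgboost classification interface (the hgboost class and its xgboost() method):

```python
from hgboost import hgboost
from sklearn.datasets import make_classification

# Stand-in data so the sketch runs on its own
X, y = make_classification(n_samples=500, n_features=20, random_state=42)

# Hypothetical optimizer settings (placeholders, not the exact values used)
hgb = hgboost(max_eval=250, threshold=0.5, cv=5, test_size=0.2,
              val_size=0.2, top_cv_evals=10, random_state=42)

# pos_label marks the positive class for the classification task
results = hgb.xgboost(X, y, pos_label=1)
```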
As noted in the documentation, results is a dictionary that, among other things, returns the best performing parameters (best_params) and the best performing model (model). However, the parameters that the best performing model uses are different from what the function returns as best_params.
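This is roughly how the two can be printed side by side; results['best_params'] and results['model'] are the documented entries of the returned dictionary, and get_params() is the standard scikit-learn/XGBoost accessor:

```python
# Parameters hgboost reports as the best ones ...
print(results['best_params'])

# ... versus the parameters the returned model is actually configured with
print(results['model'].get_params())
```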
As you can see, for example, max_depth=49 in the best_params, but the model uses max_depth=54, etc. Is this a bug or the intended behavior? In case of the latter, I'd really appreciate an explanation!
My setup: