erdogant / hgboost

hgboost is a Python package for hyper-parameter optimization of xgboost, catboost, or lightboost models using cross-validation, with evaluation of the results on an independent validation set. hgboost can be applied to classification and regression tasks.
http://erdogant.github.io/hgboost

HP Tuning: best_model uses different parameters from those that were reported as best ones #9

Closed Mikki99 closed 2 years ago

Mikki99 commented 2 years ago

I used hgboost to optimize the hyper-parameters of my XGBoost model, as described in the API References, with the following parameters:

from hgboost import hgboost

hgb = hgboost()
results = hgb.xgboost(X_train, y_train, pos_label=1, method='xgb_clf', eval_metric='logloss')

As noted in the documentation, results is a dictionary that, among other things, contains the best performing parameters (best_params) and the best performing model (model). However, the parameters used by the best performing model differ from those returned as best_params:

best_params

'params': {'colsample_bytree': 0.47000000000000003,
  'gamma': 1,
  'learning_rate': 534,
  'max_depth': 49,
  'min_child_weight': 3.0,
  'n_estimators': 36,
  'subsample': 0.96}

model

'model': XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
               colsample_bynode=1, colsample_bytree=0.47000000000000003,
               enable_categorical=False, gamma=1, gpu_id=-1,
               importance_type=None, interaction_constraints='',
               learning_rate=0.058619090164329916, max_delta_step=0,
               max_depth=54, min_child_weight=3.0, missing=nan,
               monotone_constraints='()', n_estimators=200, n_jobs=-1,
               num_parallel_tree=1, predictor='auto', random_state=0,
               reg_alpha=0, reg_lambda=1, scale_pos_weight=0.5769800646551724,
               subsample=0.96, tree_method='exact', validate_parameters=1,
               verbosity=0),

As you can see, best_params reports max_depth=49 while the model uses max_depth=54, and learning_rate is reported as 534 (not even a plausible learning rate) while the model uses 0.058619090164329916.
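A quick way to list every mismatch is to compare the reported parameters against the fitted model's get_params() (a minimal check using the results dictionary from above; get_params() is the standard scikit-learn accessor that XGBClassifier implements):

model_params = results['model'].get_params()
for name, value in results['params'].items():
    if model_params.get(name) != value:
        print(f"{name}: best_params={value}, model={model_params.get(name)}")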

Is this a bug or the intended behavior? In case of the latter, I'd really appreciate an explanation!

My setup:

erdogant commented 2 years ago

You are right. The reported parameters were inherited from HyperOpt, and for hp.choice parameters HyperOpt reports the index into the search space rather than the actual value, so some of the reported values were indices while others were real values. This is not what is expected. I now store the parameters from the actual model instead of using the HyperOpt summary. This does not change the results of the learned model, because these parameters are only stored for the output. Thanks for the issue!
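For context, here is a minimal standalone sketch of the HyperOpt behaviour behind this (not hgboost code; the search space is made up for illustration): fmin() returns index positions for hp.choice parameters, and space_eval() maps them back to the actual values.

import numpy as np
from hyperopt import fmin, tpe, hp, space_eval

# Toy search space: both parameters are hp.choice, so fmin() reports indices.
space = {
    'learning_rate': hp.choice('learning_rate', list(np.logspace(-3, 0, 1000))),
    'max_depth': hp.choice('max_depth', list(range(1, 100))),
}

def objective(params):
    # Dummy loss; a real objective would train and score a model here.
    return params['learning_rate'] + params['max_depth']

best = fmin(objective, space, algo=tpe.suggest, max_evals=10)
print(best)                     # raw fmin output: indices into the choice lists (cf. learning_rate=534 above)
print(space_eval(space, best))  # the actual values the model was trained with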

Update to the latest version (>=1.1.0):

pip install -U hgboost
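After upgrading, the reported parameters should match the fitted model. A quick sanity check, reusing the variable names from the example above (and assuming the package exposes __version__):

import hgboost
print(hgboost.__version__)  # assuming __version__ is exposed; should print >= 1.1.0

hgb = hgboost.hgboost()
results = hgb.xgboost(X_train, y_train, pos_label=1, method='xgb_clf', eval_metric='logloss')

model_params = results['model'].get_params()
mismatches = {k: (v, model_params.get(k))
              for k, v in results['params'].items()
              if model_params.get(k) != v}
print(mismatches)  # expected to be empty after the fix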