erdogant / hgboost

hgboost is a Python package for hyperparameter optimization of xgboost, catboost, or lightboost using cross-validation, with evaluation of the results on an independent validation set. hgboost can be applied to both classification and regression tasks.
http://erdogant.github.io/hgboost

Xgboost parameter #7

Closed LAH19999 closed 2 years ago

LAH19999 commented 3 years ago

After calling hgb.plot_params(), the learning rate parameter is shown as 796, which does not seem reasonable. Can I see the model parameters that were found by the hyperparameter optimization?

(screenshot of the plot_params() output, 2021-07-05)

erdogant commented 3 years ago

Thank you for your feedback. Can you maybe explain why you think that?

All model parameters are stored in results. Example:

from hgboost import hgboost

# Initialize
hg = hgboost(max_eval=10, threshold=0.5, cv=5, test_size=0.2, val_size=0.2, top_cv_evals=10, random_state=None, verbose=3)

# Load example dataset and split off the target column
df = hg.import_example()
y = df['Survived'].values
del df['Survived']
X = hg.preprocessing(df, verbose=0)

# Fit
results = hg.xgboost(X, y, pos_label=1)

print(hg.results.keys())
#dict_keys(['params', 'summary', 'trials', 'model', 'val_results', 'comparison_results'])

# To see best parameters
hg.results['params']

# To see all hyperparameters that are evaluated
hg.results['summary']
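To illustrate what inspecting these results looks like, here is a minimal, self-contained sketch. The hgboost run itself is omitted; the `results` dict below is a hand-made placeholder with the same keys the library reports (`params` for the best parameters, `summary` for the evaluated trials), and all parameter values and trial rows are invented. In the real library `summary` may be a pandas DataFrame rather than a list of dicts.

```python
# Placeholder standing in for hg.results after hg.xgboost(X, y, pos_label=1).
# All values below are invented for illustration only.
results = {
    'params': {'learning_rate': 0.08, 'max_depth': 5, 'n_estimators': 250},
    'summary': [
        {'learning_rate': 0.30, 'max_depth': 3, 'loss': 0.42},
        {'learning_rate': 0.08, 'max_depth': 5, 'loss': 0.31},
        {'learning_rate': 0.15, 'max_depth': 7, 'loss': 0.37},
    ],
}

# Best parameters found by the hyper-optimization
best_params = results['params']
print(best_params['learning_rate'])  # the actual learning rate, not an index

# All evaluated trials, sorted by loss (best first)
trials_sorted = sorted(results['summary'], key=lambda t: t['loss'])
print(trials_sorted[0])
```

The point of the sketch: the optimized learning rate lives in `results['params']`, while `results['summary']` lets you rank every evaluated hyperparameter combination by its loss.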
erdogant commented 2 years ago

Please re-open if desired.