Closed LAH19999 closed 2 years ago
Thank you for your feedback. Can you maybe explain why you think that?
All model parameters are stored in results. Example:
from hgboost import hgboost
hg = hgboost(max_eval=10, threshold=0.5, cv=5, test_size=0.2, val_size=0.2, top_cv_evals=10, random_state=None, verbose=3)
df = hg.import_example()
y = df['Survived'].values
del df['Survived']
X = hg.preprocessing(df, verbose=0)
# Fit
results = hg.xgboost(X, y, pos_label=1)
print(hg.results.keys())
#dict_keys(['params', 'summary', 'trials', 'model', 'val_results', 'comparison_results'])
# To see best parameters
hg.results['params']
# To see all hyperparameters that are evaluated
hg.results['summary']
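As for the value 796: hgboost runs its search with hyperopt, and for categorical (`hp.choice`) parameters hyperopt's trials record the *index* into the search grid rather than the value itself, so an implausibly large number in a parameter plot can simply be a grid index. A minimal self-contained sketch of mapping such an index back to an actual value, using a hypothetical learning-rate grid (not hgboost's real one):

```python
import numpy as np

# Hypothetical learning-rate grid (an assumption for illustration,
# not hgboost's actual search space)
learning_rates = np.linspace(0.001, 1.0, 1000)

# hyperopt-style trials store the chosen *index* for hp.choice parameters
trial_index = 796

# Map the logged index back to the actual learning rate
actual_lr = learning_rates[trial_index]
print(round(actual_lr, 4))  # a plausible learning rate, not 796
```

If the number you see matches an index like this, the summary table in `hg.results['summary']` should still contain the real evaluated values.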
Please re-open if desired.
After calling hgb.plot_params(), the learning rate shown is 796, which does not seem reasonable. How can I see the actual model parameters found by the hyperparameter optimization?