Enferlain opened 3 months ago
What's your optimization method? You can attach a partial screenshot as an example.

Here is an example Hyperactive app (modified slightly from https://github.com/SimonBlanke/Hyperactive/blob/master/examples/optimization_applications/hyperpara_optimize.py):
"""
This example shows the original purpose of Hyperactive.
You can search for any number of hyperparameters and Hyperactive
will return the best one after the optimization run.
"""
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.datasets import load_wine
from hyperactive import Hyperactive
data = load_wine()
X, y = data.data, data.target
def model(opt):
gbr = GradientBoostingClassifier(
n_estimators=opt["n_estimators"],
max_depth=opt["max_depth"],
min_samples_split=opt["min_samples_split"],
min_samples_leaf=opt["min_samples_leaf"],
criterion=opt["criterion"],
)
scores = cross_val_score(gbr, X, y, cv=4)
return scores.mean()
search_space = {
"n_estimators": list(range(10, 150, 5)),
"max_depth": list(range(2, 12)),
"min_samples_split": list(range(2, 25)),
"min_samples_leaf": list(range(1, 25)),
"criterion": ["friedman_mse", "squared_error"],#, "absolute_error"],
"subsample": list(np.arange(0.1, 3, 0.1)),
}
if __name__ == '__main__':
hyper = Hyperactive()
hyper.add_search(model, search_space, n_iter=40)
hyper.run()
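As an aside (my observation, not part of the original example): the search space includes a `"subsample"` key, but `model()` never passes `opt["subsample"]` to `GradientBoostingClassifier`, so those values have no effect on the score. And if it were wired in, scikit-learn would reject anything above 1.0, since `GradientBoostingClassifier` requires `0 < subsample <= 1`. A minimal sketch of constraining that search-space entry to the valid range:

```python
# sklearn's GradientBoostingClassifier requires 0 < subsample <= 1.0,
# so a search-space entry built from np.arange(0.1, 3, 0.1) contains
# many values that would raise an error if actually passed to the model.
step = 0.1
candidates = [round(step * i, 2) for i in range(1, 30)]  # 0.1 .. 2.9
valid_subsample = [v for v in candidates if 0 < v <= 1.0]
print(valid_subsample)  # [0.1, 0.2, ..., 1.0]
```

To actually use it, the model function would also need `subsample=opt["subsample"]` in the classifier's constructor.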
And its result:

```
[0] model (Random Search): 100%|───────────────────────────────────────────────────────────────────────| 40/40 [01:17<00:00, 1.94s/it, best_iter=0, best_pos=[17 5 12 18 0 21], best_score=0.9498737373737374]
Results: 'model'
Best score: 0.9776515151515152
Best parameter set:
    'n_estimators'      : 130
    'max_depth'         : 3
    'min_samples_split' : 15
    'min_samples_leaf'  : 22
    'criterion'         : friedman_mse
    'subsample'         : 0.7000000000000001
Best iteration: 0

Random seed: 416581482

Evaluation time   : 77.45525693893433 sec    [99.99 %]
Optimization time : 0.005005359649658203 sec    [0.01 %]
Iteration time    : 77.46026229858398 sec    [1.94 sec/iter]
```
You can see "Best iteration: 0" in this case (I guess this is a mistake/bug in Hyperactive). I've run around 10 auto merges so far, and for some reason it always ends with "Best iteration: 0".

Also, nearly all of the time seems to be spent on evaluation rather than optimization.

Thoughts?
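For what it's worth, one way to sanity-check the reported "Best iteration" is to recompute it from the per-iteration scores (Hyperactive can return the collected results as a DataFrame via `hyper.search_data(model)`, if I remember the API correctly). The best iteration should simply be the index of the maximum score. A minimal sketch with hypothetical scores (not from the run above), just to illustrate the check:

```python
# hypothetical per-iteration scores (illustrative only)
scores = [0.9499, 0.9618, 0.9777, 0.9551]

# the "Best iteration" should be the index of the maximum score
best_iter = max(range(len(scores)), key=lambda i: scores[i])
best_score = scores[best_iter]

print(best_iter, best_score)  # 2 0.9777
```

If the recomputed index disagrees with the printed "Best iteration: 0" while the best score clearly appeared later in the run, then it's the reported index (not the score) that's wrong.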