ARM-software / mango

Parallel Hyperparameter Tuning in Python
Apache License 2.0
335 stars 40 forks

'Tuner' object has no attribute 'getConf' #88

Closed itsceleste7 closed 1 year ago

itsceleste7 commented 1 year ago

Hello and thanks for this project!

I am trying to run the code from the README; the example I used is the 'Hyperparameter Tuning Example' provided there.

My code is the same as yours:

```python
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

from mango import Tuner, scheduler

# search space for KNN classifier's hyperparameters
# n_neighbors can vary between 1 and 50, with different choices of algorithm
param_space = dict(n_neighbors=range(1, 50),
                   algorithm=['auto', 'ball_tree', 'kd_tree', 'brute'])

@scheduler.serial
def objective(params):
    X, y = datasets.load_breast_cancer(return_X_y=True)
    clf = KNeighborsClassifier(**params)
    score = cross_val_score(clf, X, y, scoring='accuracy').mean()
    return score

tuner = Tuner(param_space, objective)
results = tuner.maximize()
print('best parameters:', results['best_params'])
print('best accuracy:', results['best_objective'])
```

=> best parameters: {'algorithm': 'auto', 'n_neighbors': 11}

=> best accuracy: 0.931486122714193

However, I got a better accuracy (best parameters: {'algorithm': 'kd_tree', 'n_neighbors': 13}, best accuracy: 0.9332401800962584). Since you mention that 'Note that best parameters may be different but accuracy should be ~ 0.9315', I wonder why I got the better result.
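One way to check the discrepancy directly, independent of the tuner's stochastic search, is to score both reported parameter settings with the same default cross-validation (a sketch using only scikit-learn; the exact numbers will depend on the installed version):

```python
from sklearn import datasets
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = datasets.load_breast_cancer(return_X_y=True)

# The two settings reported above; scoring them side by side shows
# whether the gap comes from the search or from the scorer itself.
for params in ({'algorithm': 'auto', 'n_neighbors': 11},
               {'algorithm': 'kd_tree', 'n_neighbors': 13}):
    score = cross_val_score(KNeighborsClassifier(**params), X, y,
                            scoring='accuracy').mean()
    print(params, '->', round(score, 4))
```

The default 5-fold stratified split in `cross_val_score` is deterministic, so any remaining run-to-run variation comes from the tuner's random initialization rather than from scoring.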

(Screenshot of my results, 2023-01-20 15:32:44)

I am looking forward to your reply. Thank you!

tihom commented 1 year ago

@itsceleste7 thanks for pointing this out; I am also getting the better accuracy now. It could be due to updates to the KNN classifier in sklearn. Will update the README accordingly.
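Since the gap may come from different scikit-learn releases, reporting the installed versions alongside results makes comparisons like this easier (a minimal sketch):

```python
import sys
import sklearn

# Version drift in scikit-learn (e.g. changes to KNN internals or
# default CV behaviour) can shift cross-validation scores between
# environments, so record the versions next to any reported accuracy.
print('python      :', sys.version.split()[0])
print('scikit-learn:', sklearn.__version__)
```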