Open · johnnyzheng0636 opened 6 months ago
It's my first time using hyperopt-sklearn. When I copied the two examples and ran them, the iris example gave me an error and the MNIST example gave me a different result.

I ran them in a fresh venv with the following packages:
```
Package            Version      Editable project location
------------------ ------------ -----------------------------
asttokens          2.4.1
cloudpickle        3.0.0
colorama           0.4.6
comm               0.2.1
debugpy            1.8.0
decorator          5.1.1
exceptiongroup     1.2.0
executing          2.0.1
future             0.18.3
hpsklearn          1.0.3        C:\Users\abc\hyperopt-sklearn
hyperopt           0.2.7
importlib-metadata 7.0.1
ipykernel          6.29.0
ipython            8.18.1
jedi               0.19.1
joblib             1.3.2
jupyter_client     8.6.0
jupyter_core       5.7.1
matplotlib-inline  0.1.6
nest-asyncio       1.5.9
networkx           3.2.1
numpy              1.26.3
packaging          23.2
pandas             2.1.4
parso              0.8.3
pip                22.0.4
platformdirs       4.1.0
prompt-toolkit     3.0.43
psutil             5.9.7
pure-eval          0.2.2
py4j               0.10.9.7
Pygments           2.17.2
python-dateutil    2.8.2
pytz               2023.3.post1
pywin32            306
pyzmq              25.1.2
scikit-learn       1.3.2
scipy              1.11.4
setuptools         58.1.0
six                1.16.0
stack-data         0.6.3
threadpoolctl      3.2.0
tornado            6.4
tqdm               4.66.1
traitlets          5.14.1
typing_extensions  4.9.0
tzdata             2023.4
wcwidth            0.2.13
zipp               3.17.0
```
The iris example gives:
```python
from hpsklearn import HyperoptEstimator, any_classifier, any_preprocessing
from sklearn.datasets import load_iris
from hyperopt import tpe
import numpy as np

# Download the data and split into training and test sets
iris = load_iris()
X = iris.data
y = iris.target

test_size = int(0.2 * len(y))
np.random.seed(13)
indices = np.random.permutation(len(X))
X_train = X[indices[:-test_size]]
y_train = y[indices[:-test_size]]
X_test = X[indices[-test_size:]]
y_test = y[indices[-test_size:]]

if __name__ == "__main__":
    # Instantiate a HyperoptEstimator with the search space and number of evaluations
    estim = HyperoptEstimator(classifier=any_classifier("my_clf"),
                              preprocessing=any_preprocessing("my_pre"),
                              algo=tpe.suggest,
                              max_evals=100,
                              trial_timeout=120)

    # Search the hyperparameter space based on the data
    estim.fit(X_train, y_train)

    # Show the results
    print(estim.score(X_test, y_test))
    # 1.0

    print(estim.best_model())
    # {'learner': ExtraTreesClassifier(bootstrap=False, class_weight=None, criterion='gini',
    #           max_depth=3, max_features='log2', max_leaf_nodes=None,
    #           min_impurity_decrease=0.0, min_impurity_split=None,
    #           min_samples_leaf=1, min_samples_split=2,
    #           min_weight_fraction_leaf=0.0, n_estimators=13, n_jobs=1,
    #           oob_score=False, random_state=1, verbose=False,
    #           warm_start=False), 'preprocs': (), 'ex_preprocs': ()}
```

Output:

```
100%|██████████| 1/1 [00:01<00:00, 1.81s/trial, best loss: 0.33333333333333337]
100%|██████████| 2/2 [00:01<00:00, 1.65s/trial, best loss: 0.04166666666666663]
100%|██████████| 3/3 [00:01<00:00, 1.70s/trial, best loss: 0.04166666666666663]
100%|██████████| 4/4 [00:01<00:00, 1.68s/trial, best loss: 0.04166666666666663]
100%|██████████| 5/5 [00:02<00:00, 2.21s/trial, best loss: 0.04166666666666663]
100%|██████████| 6/6 [00:01<00:00, 1.61s/trial, best loss: 0.04166666666666663]
100%|██████████| 7/7 [00:01<00:00, 1.60s/trial, best loss: 0.04166666666666663]
100%|██████████| 8/8 [00:02<00:00, 2.35s/trial, best loss: 0.04166666666666663]
100%|██████████| 9/9 [00:01<00:00, 1.62s/trial, best loss: 0.04166666666666663]
100%|██████████| 10/10 [00:01<00:00, 1.65s/trial, best loss: 0.04166666666666663]
100%|██████████| 11/11 [00:01<00:00, 1.73s/trial, best loss: 0.04166666666666663]
100%|██████████| 12/12 [00:01<00:00, 1.64s/trial, best loss: 0.04166666666666663]
100%|██████████| 13/13 [00:01<00:00, 1.86s/trial, best loss: 0.04166666666666663]
100%|██████████| 14/14 [00:02<00:00, 2.01s/trial, best loss: 0.04166666666666663]
100%|██████████| 15/15 [00:01<00:00, 1.62s/trial, best loss: 0.04166666666666663]
100%|██████████| 16/16 [00:01<00:00, 1.63s/trial, best loss: 0.04166666666666663]
100%|██████████| 17/17 [00:01<00:00, 1.65s/trial, best loss: 0.04166666666666663]
100%|██████████| 18/18 [00:02<00:00, 2.54s/trial, best loss: 0.04166666666666663]
100%|██████████| 19/19 [00:01<00:00, 1.78s/trial, best loss: 0.04166666666666663]
100%|██████████| 20/20 [00:01<00:00, 1.93s/trial, best loss: 0.04166666666666663]
100%|██████████| 21/21 [00:01<00:00, 1.83s/trial, best loss: 0.04166666666666663]
100%|██████████| 22/22 [00:01<00:00, 1.75s/trial, best loss: 0.04166666666666663]
100%|██████████| 23/23 [00:01<00:00, 1.73s/trial, best loss: 0.04166666666666663]
 96%|█████████▌| 23/24 [00:00<?, ?trial/s, best loss=?]
job exception: ExponentialLoss requires 2 classes; got 3 class(es)
 96%|█████████▌| 23/24 [00:01<?, ?trial/s, best loss=?]
```
```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
ValueError: ExponentialLoss requires 2 classes; got 3 class(es)
```
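In case it helps with triage: the message looks like it comes from plain scikit-learn rather than from hyperopt-sklearn itself. My guess (I haven't traced the hpsklearn search space, so this is an assumption) is that `any_classifier` sampled a gradient boosting configuration with the exponential loss, which scikit-learn only supports for binary problems, while iris has 3 classes. A minimal sketch that seems to reproduce the same message with scikit-learn 1.3.2 alone:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_iris(return_X_y=True)

# Exponential loss is only defined for binary classification, so fitting it
# on the 3-class iris data raises the same error as the failed trial above.
clf = GradientBoostingClassifier(loss="exponential")
clf.fit(X, y)  # ValueError: ExponentialLoss requires 2 classes; got 3 class(es)
```

What surprised me is that this single failed trial aborts the whole `estim.fit` call instead of being recorded as a failed trial while the search continues.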
The MNIST example gives:
```python
from hpsklearn import HyperoptEstimator, extra_tree_classifier
from sklearn.datasets import load_digits
from hyperopt import tpe
import numpy as np

# Download the data and split into training and test sets
digits = load_digits()
X = digits.data
y = digits.target

test_size = int(0.2 * len(y))
np.random.seed(13)
indices = np.random.permutation(len(X))
X_train = X[indices[:-test_size]]
y_train = y[indices[:-test_size]]
X_test = X[indices[-test_size:]]
y_test = y[indices[-test_size:]]

if __name__ == "__main__":
    # Instantiate a HyperoptEstimator with the search space and number of evaluations
    estim = HyperoptEstimator(classifier=extra_tree_classifier("my_clf"),
                              preprocessing=[],
                              algo=tpe.suggest,
                              max_evals=10,
                              trial_timeout=300)

    # Search the hyperparameter space based on the data
    estim.fit(X_train, y_train)

    # Show the results
    print(estim.score(X_test, y_test))
    # 0.962785714286

    print(estim.best_model())
    # {'learner': ExtraTreesClassifier(bootstrap=True, class_weight=None, criterion='entropy',
    #           max_depth=None, max_features=0.959202875857,
    #           max_leaf_nodes=None, min_impurity_decrease=0.0,
    #           min_impurity_split=None, min_samples_leaf=1,
    #           min_samples_split=2, min_weight_fraction_leaf=0.0,
    #           n_estimators=20, n_jobs=1, oob_score=False, random_state=3,
    #           verbose=False, warm_start=False), 'preprocs': (), 'ex_preprocs': ()}
```

Output:

```
  0%|          | 0/1 [00:00<?, ?trial/s, best loss=?]
100%|██████████| 1/1 [00:01<00:00, 1.64s/trial, best loss: 0.29861111111111116]
100%|██████████| 2/2 [00:01<00:00, 1.60s/trial, best loss: 0.29861111111111116]
100%|██████████| 3/3 [00:01<00:00, 1.64s/trial, best loss: 0.19791666666666663]
100%|██████████| 4/4 [00:01<00:00, 1.65s/trial, best loss: 0.19791666666666663]
100%|██████████| 5/5 [00:01<00:00, 1.64s/trial, best loss: 0.19791666666666663]
100%|██████████| 6/6 [00:01<00:00, 1.70s/trial, best loss: 0.19444444444444442]
100%|██████████| 7/7 [00:01<00:00, 1.67s/trial, best loss: 0.16666666666666663]
100%|██████████| 8/8 [00:01<00:00, 1.58s/trial, best loss: 0.16666666666666663]
100%|██████████| 9/9 [00:01<00:00, 1.53s/trial, best loss: 0.16666666666666663]
100%|██████████| 10/10 [00:01<00:00, 1.51s/trial, best loss: 0.16666666666666663]
0.8662952646239555
{'learner': ExtraTreeClassifier(criterion='entropy', max_features=0.41094293549995287, random_state=2, splitter='best'), 'preprocs': (), 'ex_preprocs': ()}
```
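Two things I noticed about the difference: the commented expected output names `ExtraTreesClassifier` (the ensemble) with parameters like `min_impurity_split` that no longer exist in scikit-learn 1.3.2, while my run produced a single `ExtraTreeClassifier`, so the expected result may simply come from an older version of the example. Also, the hyperopt search itself is randomized, so the best model and score can change between runs even with identical code. The sketch below is what I tried to make my own runs repeatable; it assumes `HyperoptEstimator` accepts a `seed` argument that is forwarded to the hyperopt search, which is my reading of the constructor and not something the docs example shows:

```python
from hpsklearn import HyperoptEstimator, extra_tree_classifier
from hyperopt import tpe

# Assumption: `seed` fixes the random state used by the hyperopt search, so
# repeated runs on the same machine sample the same sequence of trials.
estim = HyperoptEstimator(classifier=extra_tree_classifier("my_clf"),
                          preprocessing=[],
                          algo=tpe.suggest,
                          max_evals=10,
                          trial_timeout=300,
                          seed=13)
```

Even with that, I don't know whether the published 0.962785714286 score is supposed to be reproducible across library versions.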
So did I press Ctrl+V in the wrong posture? How can I reproduce the example results?