maxpumperla / hyperas

Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization
http://maxpumperla.com/hyperas/
MIT License

Continue optimization #217

Open cyberbob4 opened 5 years ago

cyberbob4 commented 5 years ago

I want to continue an optimization with additional runs. It seems that hyperopt can do this by setting the random seed and saving the Trials object. Ref: https://github.com/hyperopt/hyperopt/issues/267

To replicate that workflow, I set the random seed and ran my script with 1 evaluation, then saved the Trials object to a pickle file. For the next iteration I increased the number of runs to two and read the pickle file to initialize Trials. The previous run is picked up by hyperopt: it shows "using 1/1 trials" at the start instead of 0/0, and it performs just one additional run and stops. Unfortunately, that run is not a different one. I end up with two identical runs where the second should have been different. Is this a bug, or am I doing something wrong?

I am adding the code below, which is for the second run. For the first run, "trials = pickle.load(open("myfile.p", "rb"))" is commented out and "trials = Trials()" is used instead, with max_evals set to 1.

import pickle
from hyperopt import Trials, tpe
from hyperas import optim

# create_model and data are the model/data functions defined earlier in the script

if __name__ == '__main__':
    # trials = Trials()  # first run: start from an empty Trials object
    trials = pickle.load(open("myfile.p", "rb"))  # second run: resume from saved trials
    best_run, best_model = optim.minimize(model=create_model,
                                          data=data,
                                          algo=tpe.suggest,
                                          max_evals=2,
                                          rseed=2019,
                                          trials=trials)
    trainX, testX, trainy, testy = data()
    print("Evaluation of best performing model:")
    print(best_model.evaluate(testX, testy))
    print("Best performing model's chosen hyper-parameters:")
    print(best_run)
    pickle.dump(trials, open("myfile.p", "wb"))
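For reference, here is a minimal sketch of the same save-and-resume pattern with plain hyperopt (the toy objective, file name, and seed handling below are assumptions for illustration, not code from this thread). Note that the random state is rebuilt from a seed on every call, so reusing the same seed replays the same random stream, which is one plausible explanation for the duplicated trial described above; offsetting the seed by the number of completed trials is one possible workaround.

import pickle
import numpy as np
from hyperopt import fmin, tpe, hp, Trials

def objective(x):
    # toy objective standing in for a real model evaluation
    return (x - 3) ** 2

# resume from a previous session if a pickle exists, otherwise start fresh
try:
    with open("trials.p", "rb") as f:
        trials = pickle.load(f)
except FileNotFoundError:
    trials = Trials()

n_done = len(trials.trials)

best = fmin(fn=objective,
            space=hp.uniform("x", -10, 10),
            algo=tpe.suggest,
            max_evals=n_done + 1,  # fmin only runs the missing evaluations
            trials=trials,
            # older hyperopt versions expect np.random.RandomState here;
            # offsetting the seed avoids replaying the same random stream
            rstate=np.random.default_rng(2019 + n_done))

with open("trials.p", "wb") as f:
    pickle.dump(trials, f)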
maxpumperla commented 5 years ago

@cyberbob4 thanks for your observation. I need to investigate this further. At first glance I'm not sure how this is possible, since we eventually pass trials straight through to hyperopt's fmin, see here:

https://github.com/maxpumperla/hyperas/blob/master/hyperas/optim.py#L124

LucaUrbinati44 commented 4 years ago

Read my questions and answers in this thread. In short, I moved all my code over to plain Hyperopt (thanks to these examples).
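For anyone taking the same route, a minimal sketch of tuning a Keras model with plain Hyperopt might look like the following (the toy data, search space, and model architecture are assumptions for illustration, not the actual migrated code).

import numpy as np
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

# toy data standing in for a real dataset
X = np.random.rand(200, 10)
y = (X.sum(axis=1) > 5).astype(int)

def objective(params):
    model = Sequential([
        Input(shape=(10,)),
        Dense(int(params["units"]), activation="relu"),
        Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=5, batch_size=int(params["batch_size"]),
              verbose=0)
    _, acc = model.evaluate(X, y, verbose=0)
    # hyperopt minimizes, so return negative accuracy as the loss
    return {"loss": -acc, "status": STATUS_OK}

space = {
    "units": hp.quniform("units", 8, 64, 8),
    "batch_size": hp.quniform("batch_size", 16, 64, 16),
}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=10, trials=trials)
print(best)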