Closed mohmdelsayed closed 2 years ago
Hi, unfortunately, this feature does not exist in DeepOBS. Currently, the tuner does indeed use a single seed per explored hyperparameter setting, with multiple seeds used only for the best (chosen) hyperparameter setting. This simulates a practitioner who tunes their parameters once (using a single seed, to get an idea of well-working parameters) but might re-train the model multiple times with small changes.
If you look at the code for the tuner, you should be able to adapt it and write your own version that runs each hyperparameter setting multiple times with different seeds.
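To make this concrete, here is a minimal sketch of such a seed-averaged grid search. It is not DeepOBS code: `run_training` is a hypothetical stand-in for evaluating one optimizer configuration on a test problem (you would replace it with a call to your DeepOBS runner that returns the final validation loss), and the grid/seed handling is plain Python.

```python
import itertools
import random
import statistics

def run_training(lr, momentum, seed):
    """Hypothetical stand-in for one training run; replace with a call to
    your DeepOBS runner returning the final validation loss."""
    rng = random.Random(seed)
    # Simulated loss surface minimized at lr=0.1, momentum=0.9, plus seed noise.
    return (lr - 0.1) ** 2 + (momentum - 0.9) ** 2 + rng.gauss(0, 0.001)

def grid_search_multi_seed(grid, seeds):
    """Grid search where each setting is scored by its mean loss over seeds,
    instead of a single seed as in the default tuner."""
    results = {}
    for setting in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), setting))
        losses = [run_training(seed=s, **params) for s in seeds]
        results[setting] = statistics.mean(losses)
    best = min(results, key=results.get)
    return dict(zip(grid.keys(), best)), results

grid = {"lr": [0.01, 0.1, 1.0], "momentum": [0.0, 0.9]}
best_params, all_results = grid_search_multi_seed(grid, seeds=[0, 1, 2])
print(best_params)  # -> {'lr': 0.1, 'momentum': 0.9}
```

Averaging over a handful of seeds per setting multiplies the tuning cost by the number of seeds, but it makes the chosen hyperparameters far less sensitive to a single lucky (or unlucky) initialization.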
Thank you, Frank, for the explanation.
Hi there,
I'm trying to use DeepOBS with GridSearch. I want to do a grid search averaged over a number of seeds. As far as I know, the grid search uses just a single seed per setting, which might make the reported best-performing hyperparameters misleading because of stochasticity. Does this feature already exist? If so, please let me know how to do it with DeepOBS.
Thank you so much.