I would like to be able to "extract"/import the hyperparameter optimization classes for other projects that do not rely on clusters or that want to use hyperopt in the "inner loop". E.g. I had to re-write SMBO for the CCN Algonauts challenge and would have loved to import it from the `mle-toolbox`. As of right now, `RandomHyperoptimisation` is somewhat entangled with the `HyperOptLogger` and with information specific to the arguments of a single job (resources).
I propose to refactor all hyperoptimisation classes into wrappers around a core search class plus the MLE-specific ingredients. E.g. `RandomSearch` would have the methods `ask`, `tell`, `reload`, `refine`, and `store`, and would internally store the evaluation counter and previously evaluated parameters. This way we can simply do `from mle_toolbox.hyperopt import RandomSearch` if we want to use the search method without the rest of the toolbox.
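A minimal sketch of what the proposed ask/tell interface could look like. Class name, method signatures, and the `param_range` format are assumptions for illustration, not the actual `mle-toolbox` API:

```python
import random

class RandomSearch:
    """Hypothetical standalone random search with an ask/tell interface."""

    def __init__(self, param_range, seed=0):
        # param_range: assumed format, e.g. {"lr": (1e-4, 1e-1)}
        self.param_range = param_range
        self.rng = random.Random(seed)
        self.eval_counter = 0
        self.evaluated = []  # list of (params, score) pairs

    def ask(self, batch_size=1):
        """Propose a batch of unique candidate configurations."""
        proposals = []
        while len(proposals) < batch_size:
            candidate = {
                name: self.rng.uniform(lo, hi)
                for name, (lo, hi) in self.param_range.items()
            }
            # No duplicates within a batch -> sampling w/o replacement
            if candidate not in proposals:
                proposals.append(candidate)
        return proposals

    def tell(self, params, scores):
        """Record evaluated candidates and their performance."""
        self.evaluated.extend(zip(params, scores))
        self.eval_counter += len(params)
```

Usage would then be a plain loop, independent of any cluster tooling: `batch = strategy.ask(5)`, evaluate each config locally, then `strategy.tell(batch, scores)`.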
[x] Rewrite search API
[x] `get_hyperparam_proposal` -> `ask` and `clean_up_after_batch_iteration` -> `tell`
[x] Search classes themselves need to store search data, not only the `HyperOptLogger`
[ ] Fix reloading
[x] Random Search
[x] Implement boundary refinement based on top-k
[x] Check that there are no duplicates in a random search batch (sampling w/o replacement)
[x] Grid Search
[x] SMBO
[x] Nevergrad
[x] Easy storage and reloading of a `.pkl` log of search results
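The `.pkl` storage/reloading item could be as simple as the following sketch; the function names and the structure of the search log are assumptions, not the toolbox's actual implementation:

```python
import pickle

def store_log(search_log, path):
    """Serialize the search log (e.g. list of (params, score) pairs) to a .pkl file."""
    with open(path, "wb") as f:
        pickle.dump(search_log, f)

def reload_log(path):
    """Reload a previously stored search log from a .pkl file."""
    with open(path, "rb") as f:
        return pickle.load(f)
```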
Note: potentially think about refactoring into a sub-package `mle-search` or `mle-hyperopt`.