automl / labwatch

An extension to Sacred for automated hyperparameter optimization.

Vision for this project? #13

Open rueberger opened 6 years ago

rueberger commented 6 years ago

I was delighted to stumble upon this project just now, in Klaus's 2017 SciPy sacred write-up.

I feel that there is great potential and need for such a project.

sacred has revolutionized ML development for us, and we are rooting for it to become a standard component of the modern ML ecosystem.

In much the same way that sacred cracked down on the massive headache of ensuring reproducibility during model development by providing high-level, easy-to-use tools, there seems to be a huge opportunity to tame the wild west of hyperparam tuning by building high-level tools as a community.

Building on top of sacred for this makes perfect sense. For us, the only supported way to run experiments is through sacred, so if we were to plug into a hyperparameter optimization library (e.g. spearmint), points searched in hyperparam space would be represented as sacred experiments anyway.

Additionally, afaik there are no major FOSS hyperparameter optimization projects right now. Spearmint is commonly used in my experience, but it is not free for commercial use. About 6 months ago I did a fairly extensive survey of the available options, and all I found was hyperopt, which is okay but can be a pain to use[1] and lacks Bayesian algorithms. More recently, I stumbled upon the Ray Tune framework, which I haven't had a chance to investigate deeply but which looks very promising.

I'm curious about the vision of the original authors for this project?

From where I'm sitting (admittedly not having looked deeply into the current design of labwatch), I feel that a keras-level tool would be appropriate: high-level glue between hyperopt libraries and sacred, focused on providing a clean interface but not necessarily implementing the underlying hyperopt algos.

[1] Referring mostly to its parallelization features, not trying to knock hyperopt; I found it to be the best option in my recent survey, and it's pretty easy to use for simple searches.

Qwlouse commented 6 years ago

Hey @rueberger! Happy to see you more and more often :-) But I am afraid this project is currently in a beta state at best and unmaintained. AFAICT the original authors have abandoned it, and I currently don't have the capacity to pick it up. So the sad truth is that this really useful project is rather dead.

My original vision for this project was to provide an easy-to-use interface between sacred and different hyperparameter optimizers such as spearmint, hyperopt, RoBO, scikit-optimize, etc. The idea was to have a simple language for defining search spaces, usable much like sacred named_configs. Whenever such a search space was used, the library would handle the conversion to the search-space format of the respective optimizer, take care of running the optimizer, get a suggestion, and use it as config updates in sacred to start the experiment. All of this was meant to provide a convenient, standardized interface to different optimizer backends, to reduce the hurdle and friction of using hyperparameter optimizers.
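
For concreteness, here is a rough, hypothetical sketch of that loop. It uses sacred's real `Experiment` / `config_updates` mechanism, but the `search_space` dict and the `suggest` function are plain random-search stand-ins for whatever backend (spearmint, hyperopt, RoBO, ...) would actually propose points; none of this is labwatch's API.

```python
import random

from sacred import Experiment

ex = Experiment('labwatch_sketch')


@ex.config
def cfg():
    C = 1.0        # defaults; overridden per run via config_updates
    gamma = 0.1


@ex.main
def objective(C, gamma):
    # Dummy objective standing in for "train a model and report a metric".
    return -((C - 10.0) ** 2 + (gamma - 0.5) ** 2)


# Hypothetical search-space description, playing the role compared above
# to sacred named_configs: one place where the ranges live.
search_space = {
    'C': (1e-3, 1e2),
    'gamma': (1e-4, 1.0),
}


def suggest(space):
    # Stand-in for an optimizer backend's suggest(); a real backend would
    # use the results observed so far to propose the next point.
    return {name: random.uniform(lo, hi) for name, (lo, hi) in space.items()}


best = None
for _ in range(10):
    point = suggest(search_space)
    run = ex.run(config_updates=point)   # each evaluated point is a sacred run
    if best is None or run.result > best[1]:
        best = (point, run.result)

print('best point found:', best)
```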