SimonBlanke / Hyperactive

An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
https://simonblanke.github.io/hyperactive-documentation
MIT License
512 stars 42 forks

New feature: Optimization Strategies #54

Closed SimonBlanke closed 1 year ago

SimonBlanke commented 2 years ago

I would like to introduce a new feature to Hyperactive that chains together multiple optimization algorithms. This will be called an Optimization Strategy in the future.

The API for this feature could look like this:

```python
opt_strat = OptimizationStrategy()
opt_strat.add_optimizer(RandomSearchOptimizer(), duration=0.5)
opt_strat.add_optimizer(HillClimbingOptimizer(), duration=0.5)

hyper = Hyperactive()
hyper.add_search(model, search_space, n_iter=20, optimizer=opt_strat)
hyper.run()
```

The duration is the fraction of the n_iter passed to add_search(...) that each optimizer receives. Each optimizer will automatically pass its collected memory on to the next one.
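To illustrate the idea, here is a minimal sketch of how such a strategy could split the iteration budget by duration fractions and hand memory from one optimizer to the next. The class, method names, and the `memory` keyword are assumptions for illustration, not the actual Hyperactive API:

```python
class OptimizationStrategy:
    """Hypothetical sketch: chain optimizers, splitting n_iter by duration."""

    def __init__(self):
        self.optimizer_setup = []  # list of (optimizer, duration) pairs

    def add_optimizer(self, optimizer, duration=1.0):
        self.optimizer_setup.append((optimizer, duration))

    def split_iterations(self, n_iter):
        # Normalize durations so they sum to 1, then convert each
        # fraction into an integer share of n_iter.
        total = sum(d for _, d in self.optimizer_setup)
        splits = [int(n_iter * d / total) for _, d in self.optimizer_setup]
        # Give any rounding remainder to the last optimizer.
        splits[-1] += n_iter - sum(splits)
        return splits

    def run(self, objective, n_iter):
        memory = None  # search data handed from one optimizer to the next
        for (optimizer, _), iters in zip(self.optimizer_setup,
                                         self.split_iterations(n_iter)):
            memory = optimizer.search(objective, n_iter=iters, memory=memory)
        return memory
```

With the example above (two optimizers, duration=0.5 each, n_iter=20), each optimizer would receive 10 iterations, and the second would start from the memory collected by the first.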

This feature idea is at an early stage and might change in the future.

SimonBlanke commented 1 year ago

While working on this issue I encountered a problem with the progress bar when running multiple optimization algorithms within Hyperactive. Currently the progress bar is fully controlled by gradient-free-optimizers. The problem was that (for the example above) one progress bar would run for 10 iterations and then a second one would run for another 10 iterations. This is not how I want this to work: a single progress bar should show the entire search progress.

The fix for this problem was very complicated, because hyperactive had no access to the iteration-loop but just called .search(...) of the gfo-optimizer. I had to create something like an underlying gfo-api in this commit, by creating init-, step-, and finish stages of the search. This enables hyperactive to loop over the step-method and use this loop to update its own progress-bar. It works perfectly so far and I am very happy with the result.