JuliaAI / MLJTuning.jl

Hyperparameter optimization algorithms for use in the MLJ machine learning framework
MIT License

Frameworks for HP optimization #55

azev77 opened this issue 4 years ago

azev77 commented 4 years ago

Julia HP optimization packages:

Other HP optimization packages:

There are projects that benchmark different AutoML systems: https://openml.github.io/automlbenchmark/

Following up on our conversation (https://github.com/alan-turing-institute/MLJ.jl/issues/416#issuecomment-640823116), I wanted to tell you guys about Optuna (repo & paper), a new framework for HP optimization. A nice comparison with Hyperopt shows what can be done for HP visualization: https://neptune.ai/blog/optuna-vs-hyperopt

Here are a few snips: [screenshots of Optuna's hyperparameter visualizations from the comparison above]

A 3-minute clip: https://www.youtube.com/watch?v=-UeC4MR3PHM

It would really be amazing for MLJ to incorporate this!

ghost commented 4 years ago

> Hyperopt.py (also Hyperopt.jl)

To clarify, Hyperopt.jl is not related to the Python hyperopt. It uses different optimisation techniques (random search, Latin hypercube sampling, and Bayesian optimisation) and deserves its own position in the list.

However, TreeParzen.jl is a direct port of the Python hyperopt to Julia; it uses the same optimisation technique (Tree-structured Parzen Estimators) and has the same behaviour.
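
For anyone comparing interfaces, here is a minimal sketch of Hyperopt.jl's macro-based API as I understand it (the objective is a made-up toy function, not from any of the packages above):

```julia
using Hyperopt

# Made-up toy objective over two hyperparameters
f(a, b) = (a - 3)^2 + log10(b)^2

ho = @hyperopt for i = 50,
        sampler = RandomSampler(),   # alternatives: LHSampler(), GPSampler(Min)
        a = LinRange(1, 5, 50),
        b = exp10.(LinRange(-2, 2, 50))
    f(a, b)   # the value returned by the body is what gets minimized
end

best_params, min_f = ho.minimizer, ho.minimum
```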

vollmersj commented 4 years ago

Maybe it is worth considering bandit frameworks such as Ax.

azev77 commented 4 years ago

Thanks @vollmersj, added. Please let me know if you have other suggestions.

casasgomezuribarri commented 3 years ago

Hi, I'm really excited to see a Bayesian optimization method for hyperparameter tuning! I note that RandomSearch() and LatinHypercube() are already possible choices for the tuning = kwarg of TunedModel(), and I see them grouped together with Bayesian optimization in the original post of this issue. Is it already possible to tune hyperparameters this way, and if not, is there any notion of whether it will be out soon, and how soon? :-)
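
For reference, here is the kind of setup I mean, using the strategies that already work. This is just a minimal sketch; the model, ranges, and dataset are placeholders:

```julia
using MLJ

# Placeholder model and data; any supervised model with numeric
# hyperparameters would do.
Tree = @load DecisionTreeClassifier pkg=DecisionTree verbosity=0
tree = Tree()
X, y = @load_iris

# Ranges for the hyperparameters to be tuned:
r1 = range(tree, :max_depth, lower=2, upper=10)
r2 = range(tree, :min_samples_split, lower=2, upper=20)

# Swap RandomSearch() for LatinHypercube(gens=2, popsize=120), etc.
tuned_tree = TunedModel(model=tree,
                        tuning=RandomSearch(),
                        range=[r1, r2],
                        resampling=CV(nfolds=5),
                        measure=log_loss,
                        n=25)

mach = machine(tuned_tree, X, y)
fit!(mach)
report(mach).best_model
```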

Thanks a lot for basically everything so far!