automl / Auto-PyTorch

Automatic architecture search and hyperparameter optimization for PyTorch
Apache License 2.0
2.37k stars · 287 forks

What algorithm is used by Auto-PyTorch #11

Closed · ashukid closed this issue 5 years ago

ashukid commented 5 years ago

I was reading papers on AutoML and architecture search. There are actually a lot of ways to do this; one very obvious one is to train different models directly.

I was wondering what this library is using?

LMZimmer commented 5 years ago

Hello and thank you for your interest in AutoPyTorch.

Two algorithms are used: BOHB and Hyperband. BOHB combines Bayesian optimization with the bandit-based Hyperband and is the default for the "small_cs" and "medium_cs" config presets.

For the implementation we use the HpBandSter package.
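
To make this concrete, here is a minimal, self-contained sketch of how BOHB is typically driven through HpBandSter. The worker, its toy objective, and the single learning-rate hyperparameter are illustrative placeholders, not the configuration space Auto-PyTorch actually searches.

```python
import ConfigSpace as CS
import hpbandster.core.nameserver as hpns
from hpbandster.core.worker import Worker
from hpbandster.optimizers import BOHB


class ToyWorker(Worker):
    """Illustrative worker: evaluates one configuration on a given budget."""

    def compute(self, config, budget, **kwargs):
        # In Auto-PyTorch the budget would correspond to e.g. training epochs;
        # here we just fake a loss that improves with larger budgets.
        loss = (config["lr"] - 0.01) ** 2 + 1.0 / budget
        return {"loss": loss, "info": {"budget": budget}}

    @staticmethod
    def get_configspace():
        cs = CS.ConfigurationSpace()
        cs.add_hyperparameter(
            CS.UniformFloatHyperparameter("lr", lower=1e-4, upper=1e-1, log=True)
        )
        return cs


if __name__ == "__main__":
    # BOHB talks to its workers through a nameserver.
    ns = hpns.NameServer(run_id="toy", host="127.0.0.1", port=None)
    ns.start()

    worker = ToyWorker(nameserver="127.0.0.1", run_id="toy")
    worker.run(background=True)

    bohb = BOHB(
        configspace=ToyWorker.get_configspace(),
        run_id="toy",
        nameserver="127.0.0.1",
        min_budget=1,
        max_budget=9,
    )
    result = bohb.run(n_iterations=4)

    bohb.shutdown(shutdown_workers=True)
    ns.shutdown()

    best = result.get_incumbent_id()
    print("best config:", result.get_id2config_mapping()[best]["config"])
```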

mlindauer commented 5 years ago

Hi @ashukid,

Thank you for asking. Our main reference is the AutoNet chapter of our AutoML book: https://www.automl.org/wp-content/uploads/2019/05/AutoML_Book_Chapter7.pdf

Previously, the project was named AutoNet. Since that was too generic, we changed the name to AutoPyTorch. AutoNet 2.0 in the reference mentioned above is indeed AutoPyTorch.

Best, Marius

ashukid commented 5 years ago

@mlindauer I haven't gone through the book yet, but I will do so very soon. I was wondering how this algorithm differs from the one Google is using (the NASNet one).

mlindauer commented 5 years ago

@ashukid Google has published more than one approach in the last two years, so there is no single "Google approach". For the original NASNet paper, they used reinforcement learning. As Lucas pointed out, we use a completely different approach, Bayesian optimization, and combine it with Hyperband (an iterative bandit approach). If you are not familiar with the different trends in NAS, I would recommend reading Chapter 3 of the AutoML book first: https://www.automl.org/wp-content/uploads/2019/05/AutoML_Book_Chapter3.pdf
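
To illustrate the Hyperband side, below is a short, standalone sketch of the bracket arithmetic from the Hyperband paper (with eta = 3 and a maximum budget of 81 as example values); it is not Auto-PyTorch code. BOHB follows the same budget schedule but samples configurations with a Bayesian model instead of uniformly at random.

```python
import math


def hyperband_schedule(max_budget=81, eta=3):
    """Yield, for each bracket s, the (n_configs, budget) rounds of
    successive halving, following Li et al. (2017)."""
    # Largest s with eta**s <= max_budget, computed without float log.
    s_max = 0
    while eta ** (s_max + 1) <= max_budget:
        s_max += 1
    for s in range(s_max, -1, -1):
        n = int(math.ceil((s_max + 1) / (s + 1) * eta ** s))  # initial configs
        b = max_budget * eta ** (-s)                          # initial budget
        rounds = [(int(n * eta ** (-i)), b * eta ** i) for i in range(s + 1)]
        yield s, rounds


for s, rounds in hyperband_schedule():
    print(f"bracket s={s}: "
          + ", ".join(f"{n} configs @ budget {b:g}" for n, b in rounds))
```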

Best, Marius