Closed: ashukid closed this issue 5 years ago
Hello and thank you for your interest in AutoPyTorch.
Two algorithms are used: BOHB and Hyperband. BOHB combines Bayesian optimization with the bandit-based Hyperband and is the default for the "small_cs" and "medium_cs" config presets.
For the implementation, we use the HpBandSter package.
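For reference, here is a minimal usage sketch in the style of the old AutoNet-era AutoPyTorch API. The positional preset string and budget/runtime arguments follow the project's documented examples, but the `algorithm` keyword and exact parameter names are assumptions and may differ between versions:

```python
from autoPyTorch import AutoNetClassification
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Small toy dataset just to make the sketch runnable.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# "medium_cs" selects the medium config-space preset; BOHB is the default
# search algorithm for the small/medium presets mentioned above.
# (The "algorithm" keyword is an assumption and may vary between versions.)
autonet = AutoNetClassification("medium_cs",
                                algorithm="bohb",   # or "hyperband"
                                min_budget=30,      # smallest budget per configuration
                                max_budget=90,      # largest budget per configuration
                                max_runtime=300,    # overall time limit in seconds
                                log_level="info")

autonet.fit(X_train, y_train, validation_split=0.3)
y_pred = autonet.predict(X_test)
```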
Hi @ashukid ,
Thank you for asking. Our main reference is the AutoNet chapter of our AutoML book: https://www.automl.org/wp-content/uploads/2019/05/AutoML_Book_Chapter7.pdf
Previously, the name of the project was AutoNet. Since this was too generic, we changed the name to AutoPyTorch. AutoNet 2.0 in the reference mentioned above is indeed AutoPyTorch.
Best, Marius
@mlindauer I haven't gone through the book yet, but I will do so very soon. I was wondering how this algorithm differs from the one Google is using (the NASNet one).
@ashukid Google has published more than one approach in the last two years, so there is no single "Google approach". For the original NASNet paper, they used reinforcement learning. As Lucas pointed out, we use a completely different approach, Bayesian optimization, and combine it with Hyperband (an iterative bandit approach). If you are not familiar with the different trends in NAS, I would recommend reading Chapter 3 of the AutoML book first: https://www.automl.org/wp-content/uploads/2019/05/AutoML_Book_Chapter3.pdf
Best, Marius
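To make the "iterative bandit" part more concrete, here is a small, self-contained sketch of how Hyperband allocates budgets via successive halving. This is a generic illustration of the algorithm, not AutoPyTorch's internal code (HpBandSter handles this for AutoPyTorch), and the budget values are just example numbers:

```python
import math

def hyperband_schedule(min_budget, max_budget, eta=3):
    """Compute the bracket structure Hyperband iterates over.

    Each bracket starts many configurations on a small budget and
    repeatedly keeps only the best 1/eta of them on an eta-times
    larger budget (successive halving).
    """
    s_max = int(math.log(max_budget / min_budget, eta))
    brackets = []
    for s in range(s_max, -1, -1):
        n = int(math.ceil((s_max + 1) * eta ** s / (s + 1)))        # initial configs in this bracket
        budgets = [max_budget * eta ** (i - s) for i in range(s + 1)]  # budget per rung
        survivors = [int(n * eta ** (-i)) for i in range(s + 1)]       # configs kept per rung
        brackets.append(list(zip(survivors, budgets)))
    return brackets

# Example: budgets between 9 and 81 epochs with eta=3.
for bracket in hyperband_schedule(min_budget=9, max_budget=81, eta=3):
    print(bracket)  # [(configs_to_run, budget_per_config), ...]
```

The difference in BOHB is that the configurations started in each bracket are proposed by a Bayesian-optimization model fitted on previous evaluations, rather than being sampled uniformly at random.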
I was reading papers on AutoML and architecture search. There are actually a lot of ways to do this; one very obvious approach is to directly train different models. I was wondering which approach this library uses?