howardyclo / papernotes

My personal notes and surveys on DL, CV and NLP papers.

Neural Architecture Search #55

Open howardyclo opened 5 years ago


Neural Architecture Search TL;DR

This TL;DR aims to give myself a quick view of the neural architecture search literature. Hope it helps you too 😁!

Designing a neural network for a task of interest often requires significant architecture engineering. Neural Architecture Search (NAS/AutoML) aims to automatically find a neural network architecture that achieves good or even state-of-the-art performance on the task of interest. Much of the NAS work focuses on designing the search space (e.g., which activation functions or cell operations to search over). There are two main categories of NAS methods: evolutionary search (ES) algorithms and reinforcement learning (RL) algorithms. Here we mainly overview RL-based methods.
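To make the RL-based idea concrete, here is a minimal toy sketch of a REINFORCE-style controller in the spirit of [1]: a controller keeps a distribution over per-layer operations, samples architectures, and shifts probability mass toward choices that earn higher reward. The search space (`OPS`), the `proxy_reward` function (a stand-in for validation accuracy), and all hyperparameters are illustrative assumptions, not taken from any of the papers above.

```python
import math
import random

# Hypothetical toy search space: one operation chosen per layer.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
NUM_LAYERS = 3

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_architecture(logits):
    """Sample one op index per layer from the controller's distribution."""
    return [
        random.choices(range(len(OPS)), weights=softmax(layer_logits))[0]
        for layer_logits in logits
    ]

def proxy_reward(arch):
    """Toy stand-in for validation accuracy: pretend op 0 is always best."""
    return sum(1 for op in arch if op == 0) / len(arch)

def reinforce_step(logits, arch, reward, baseline, lr=0.5):
    """REINFORCE update: grad of log-softmax is (one-hot - probs)."""
    for layer, chosen in enumerate(arch):
        probs = softmax(logits[layer])
        for i in range(len(OPS)):
            grad = (1.0 if i == chosen else 0.0) - probs[i]
            logits[layer][i] += lr * (reward - baseline) * grad

random.seed(0)
logits = [[0.0] * len(OPS) for _ in range(NUM_LAYERS)]
baseline = 0.0  # exponential moving average of reward, reduces variance
for step in range(500):
    arch = sample_architecture(logits)
    r = proxy_reward(arch)
    baseline = 0.9 * baseline + 0.1 * r
    reinforce_step(logits, arch, r, baseline)

# Read off the most likely architecture after training.
best_arch = [OPS[max(range(len(OPS)), key=lambda i: layer[i])] for layer in logits]
print(best_arch)
```

In the real papers, the controller is an RNN, the reward is the validation accuracy of the sampled child network after (partial) training, and the cost of that inner training loop is exactly what methods like ENAS [5], DARTS [7], and ProxylessNAS [9] try to avoid.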

[1] Neural Architecture Search with Reinforcement Learning (NAS). ICLR 2017.

[2] Learning Transferable Architectures for Scalable Image Recognition (NASNet). CVPR 2018.

[3] Efficient Architecture Search by Network Transformation (EAS). AAAI 2018. [Code]

[4] Progressive Neural Architecture Search (PNAS). ECCV 2018. [TF Code][PT Code]

[5] Efficient Neural Architecture Search via Parameter Sharing (ENAS). ICML 2018.

[6] Understanding and Simplifying One-Shot Architecture Search (One-Shot). ICML 2018.

[7] DARTS: Differentiable Architecture Search. ICLR 2019.

[8] InstaNAS: Instance-aware Neural Architecture Search. AAAI 2020 | ICML 2019 Workshop. [Website][Code]

[9] ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware. ICLR 2019. [Website] [Poster] [Code]

[10] Once-For-All: Train One Network and Specialize It for Efficient Deployment. ICLR 2020.

For the latest state of NAS, see A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions.