Our seven 2019 papers on neural architecture search (NAS)

Neural Architecture Search (NAS) is one of the hottest topics in AutoML these days, and our group is publishing very actively in this area. We have seven NAS papers in 2019, which may make us one of the world's most active groups in NAS (surpassed only by a small company called Google ;-). Here are the papers and quick blog posts about them:

    • Neural Architecture Search: A Survey: this is the first survey of the modern NAS literature. We survey the field and categorize research along three dimensions: search space, search strategy, and performance estimation strategy. The paper was published in JMLR in 2019, and we'll update the arXiv version regularly.
    • Efficient Multi-objective NAS via Network Morphisms: in this ICLR 2019 paper, written with our collaborators at the Bosch Center for Artificial Intelligence, we introduce a natural and efficient multi-objective NAS approach. Do you care about model size, test-time speed, and performance on different benchmarks? No problem: the approach is truly multi-objective (no scalarization of multiple objectives, as in many other works).
    • Learning to Design RNA (aka: AutoRL based on joint NAS & HPO): in this ICLR 2019 paper, we tackle an application problem (RNA design), but to do so we use NAS to optimize the neural architecture of deep RL algorithms. We also jointly optimize other hyperparameters and parameters of the MDP formulation, yielding a hands-off version of PPO that paves the way towards Auto-RL!
    • NAS-Bench-101: in this ICML 2019 paper, together with our collaborators at Google Brain, we introduce the first benchmark for NAS. We exhaustively evaluated a small cell-based search space, so anyone can now run NAS experiments on this space in minutes on a laptop, facilitating reproducibility, comparability, and scientific rigor in NAS research!
    • AutoDispNet: Improving Disparity Estimation with AutoML: in this ICCV 2019 paper, we show how to optimize U-Net-like architectures for dense regression and tune their hyperparameters with BOHB, yielding a new state of the art for disparity estimation. (Link to paper; blog post to come soon)
    • A new paper & blog post on gradient-based NAS (sneak preview; online next week)
    • A new paper & blog post on best practices in NAS (sneak preview; online next week)
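To make the "truly multi-objective" point above concrete: instead of collapsing objectives into one weighted score, a multi-objective search keeps the whole Pareto front of non-dominated architectures. Here is a minimal sketch of that filtering step (all names and numbers are illustrative, not from the paper):

```python
# Hedged sketch: keeping the Pareto-optimal set of architectures when
# each candidate has several objectives to MINIMIZE (e.g. validation
# error, parameter count, inference time). Illustrative only.

def dominates(a, b):
    """True if candidate a is at least as good as b in every
    objective and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of objective tuples."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

# Toy candidates: (validation error, millions of parameters).
archs = [(0.08, 25.0), (0.06, 40.0), (0.10, 5.0), (0.09, 30.0)]
print(pareto_front(archs))  # (0.09, 30.0) drops out: (0.08, 25.0) beats it in both
```

No scalarization weights are needed: the user picks a point on the returned front after the search, rather than committing to a trade-off beforehand.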
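The reason NAS-Bench-101 experiments run "in minutes on a laptop" is that every architecture's performance is precomputed, so each "training run" is just a table lookup. A minimal sketch of random search against such a tabular benchmark (the dictionary stands in for the real NAS-Bench-101 API, whose interface differs; all names and numbers are illustrative):

```python
import random

# Hedged sketch: random search on a tabular NAS benchmark. A real
# NAS-Bench-101 run would query the released API instead of this toy
# lookup table; names and accuracies here are made up.
toy_benchmark = {
    "arch_a": 0.942,  # precomputed validation accuracy per architecture
    "arch_b": 0.928,
    "arch_c": 0.951,
    "arch_d": 0.937,
}

def random_search(benchmark, n_samples, seed=0):
    """Sample architectures at random and return the best one seen.
    Each evaluation is a table lookup, so this finishes instantly."""
    rng = random.Random(seed)
    best_arch, best_acc = None, -1.0
    for _ in range(n_samples):
        arch = rng.choice(list(benchmark))
        acc = benchmark[arch]
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return best_arch, best_acc

print(random_search(toy_benchmark, n_samples=50))
```

Because every method queries the same precomputed table, two research groups running the same search strategy get exactly comparable results, which is the reproducibility point the paper makes.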
