【Paper Reading】Single Path One-Shot Neural Architecture Search with Uniform Sampling

Paper reading · NAS-related

Single Path One-Shot Neural Architecture Search with Uniform Sampling

Authors:

Zichao Guo∗, Xiangyu Zhang∗, Haoyuan Mu, Wen Heng, Zechun Liu, Yichen Wei, Jian Sun

Affiliations:

Megvii Technology

Tsinghua University

Hong Kong University of Science and Technology

Abstract:

One-shot method [2] is a powerful Neural Architecture Search (NAS) framework, but its training is non-trivial and it is difficult to achieve competitive results on large-scale datasets like ImageNet. In this work, we propose a Single Path One-Shot model to address its main challenge in the training. Our central idea is to construct a simplified supernet, Single Path Supernet, which is trained by a uniform path sampling method. All underlying architectures (and their weights) get trained fully and equally. Once we have a trained supernet, we apply an evolutionary algorithm to efficiently search the best-performing architectures without any fine-tuning. Comprehensive experiments verify that our approach is flexible and effective. It is easy to train and fast to search. It effortlessly supports complex search spaces (e.g., building blocks, channel, mixed-precision quantization) and different search constraints (e.g., FLOPs, latency). It is thus convenient to use for various needs. It achieves state-of-the-art performance on the large dataset ImageNet.
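The training idea in the abstract (draw one path uniformly at random per step and update only its weights) can be made concrete with a short sketch. The PyTorch code below is a minimal illustration under assumed names (ChoiceBlock, SuperNet, NUM_LAYERS, NUM_CHOICES) with plain convolutions as placeholder candidates; the paper's actual search space uses ShuffleNet-style blocks, so this is not the authors' implementation.

```python
import random
import torch
import torch.nn as nn

NUM_LAYERS = 20   # number of choice blocks in the supernet (assumed)
NUM_CHOICES = 4   # candidate operations per block (assumed)

class ChoiceBlock(nn.Module):
    """One supernet layer holding all candidate operations."""
    def __init__(self, channels):
        super().__init__()
        # Placeholder candidates; the paper uses ShuffleNet-style units.
        self.ops = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2)
            for k in (1, 3, 5, 7)
        )

    def forward(self, x, choice):
        # Only the sampled candidate is executed: the "single path" property.
        return self.ops[choice](x)

class SuperNet(nn.Module):
    def __init__(self, channels=16, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(ChoiceBlock(channels) for _ in range(NUM_LAYERS))
        self.classifier = nn.Linear(channels, num_classes)

    def forward(self, x, arch):
        x = self.stem(x)
        for block, choice in zip(self.blocks, arch):
            x = block(x, choice)
        x = x.mean(dim=(2, 3))            # global average pooling
        return self.classifier(x)

def sample_arch():
    # Uniform sampling: every candidate in every layer is equally likely,
    # so all sub-network weights get trained fully and equally.
    return [random.randrange(NUM_CHOICES) for _ in range(NUM_LAYERS)]

def train_step(supernet, optimizer, criterion, images, labels):
    arch = sample_arch()                  # draw one single path for this step
    loss = criterion(supernet(images, arch), labels)
    optimizer.zero_grad()
    loss.backward()                       # only the sampled path receives gradients
    optimizer.step()
    return loss.item()
```

Because each step updates only one uniformly sampled path, supernet weight training is decoupled from architecture search: no architecture parameters are learned jointly with the weights.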

Key innovations:

1、In this work, we propose a Single Path One-Shot model to address its main challenge in the training.

2、Once we have a trained supernet, we apply an evolutionary algorithm to efficiently search the best-performing architectures without any fine-tuning (see the search-loop sketch after this list).

3、Comprehensive experiments verify that our approach is flexible and effective. It is easy to train and fast to search.

4、It achieves state-of-the-art performance on the large dataset ImageNet.
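The evolutionary search in point 2 amounts to a simple loop over candidate architectures that are evaluated with inherited supernet weights. The sketch below is an assumed illustration, not the paper's released code: evaluate_arch and satisfies_constraint are hypothetical callables (validation accuracy using supernet weights, and a FLOPs/latency check), and the population sizes are made up.

```python
import random

NUM_LAYERS, NUM_CHOICES = 20, 4          # must match the trained supernet
POP_SIZE, NUM_GENERATIONS, TOP_K = 50, 20, 10
MUTATION_PROB = 0.1

def random_arch():
    return [random.randrange(NUM_CHOICES) for _ in range(NUM_LAYERS)]

def mutate(arch):
    # Resample each layer choice independently with a small probability.
    return [random.randrange(NUM_CHOICES) if random.random() < MUTATION_PROB else c
            for c in arch]

def crossover(a, b):
    # Pick each layer choice from one of the two parents at random.
    return [random.choice(pair) for pair in zip(a, b)]

def evolutionary_search(evaluate_arch, satisfies_constraint):
    # Seed the population with random architectures that meet the
    # search constraint (e.g. a FLOPs or latency budget).
    population = []
    while len(population) < POP_SIZE:
        arch = random_arch()
        if satisfies_constraint(arch):
            population.append(arch)

    for _ in range(NUM_GENERATIONS):
        # Rank by validation accuracy measured with supernet weights;
        # candidate architectures are never fine-tuned during the search.
        population.sort(key=evaluate_arch, reverse=True)
        parents = population[:TOP_K]
        children = []
        while len(children) < POP_SIZE - TOP_K:
            child = mutate(crossover(*random.sample(parents, 2)))
            if satisfies_constraint(child):
                children.append(child)
        population = parents + children

    return max(population, key=evaluate_arch)
```

The search itself only runs inference with the shared supernet weights, which is what keeps it cheap; the finally selected architecture is then trained from scratch, as is standard for weight-sharing NAS.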
