TY - GEN
T1 - EDNAS: An Efficient Neural Architecture Design based on Distribution Estimation
AU - Zhao, Zhenyao
AU - Zhang, Guang En
AU - Jiang, Min
AU - Feng, Liang
AU - Tan, Kay Chen
N1 - Funding Information:
This work was supported by the National Natural Science Foundation of China (No. 61673328, No. 61876162), the Shenzhen Scientific Research and Development Funding Program (No. JCYJ20180307123637294), and the Research Grants Council of the Hong Kong SAR (No. CityU11209219).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/23
Y1 - 2020/10/23
N2 - Neural architecture search (NAS) is the process of automatically searching for the best-performing neural model for a given task. Designing a neural model by hand demands a great deal of expert time; NAS's automated process effectively removes this burden and makes neural networks easier to adopt. Although NAS has achieved excellent performance, its search process is still very time consuming. In this paper, we propose EDNAS, a neural architecture design method based on distribution estimation that provides a fast and economical way to design neural architectures automatically. In EDNAS, we assume that the best-performing architecture obeys a certain probability distribution over the search space, so NAS can be transformed into the problem of learning this distribution. We construct a probability model on the search space and learn the target distribution by iteratively updating this model. Finally, an architecture that maximizes performance on a validation set is generated from the learned distribution. Experiments demonstrate the efficiency of our method: on the CIFAR-10 dataset, EDNAS discovers a novel architecture in just 4 hours with a 2.89% test error, showing both efficiency and strong performance.
AB - Neural architecture search (NAS) is the process of automatically searching for the best-performing neural model for a given task. Designing a neural model by hand demands a great deal of expert time; NAS's automated process effectively removes this burden and makes neural networks easier to adopt. Although NAS has achieved excellent performance, its search process is still very time consuming. In this paper, we propose EDNAS, a neural architecture design method based on distribution estimation that provides a fast and economical way to design neural architectures automatically. In EDNAS, we assume that the best-performing architecture obeys a certain probability distribution over the search space, so NAS can be transformed into the problem of learning this distribution. We construct a probability model on the search space and learn the target distribution by iteratively updating this model. Finally, an architecture that maximizes performance on a validation set is generated from the learned distribution. Experiments demonstrate the efficiency of our method: on the CIFAR-10 dataset, EDNAS discovers a novel architecture in just 4 hours with a 2.89% test error, showing both efficiency and strong performance.
UR - http://www.scopus.com/inward/record.url?scp=85098496132&partnerID=8YFLogxK
U2 - 10.1109/IAI50351.2020.9262190
DO - 10.1109/IAI50351.2020.9262190
M3 - Conference article published in proceeding or book
AN - SCOPUS:85098496132
T3 - 2nd International Conference on Industrial Artificial Intelligence, IAI 2020
SP - 1
EP - 6
BT - 2nd International Conference on Industrial Artificial Intelligence, IAI 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2nd International Conference on Industrial Artificial Intelligence, IAI 2020
Y2 - 23 October 2020 through 25 October 2020
ER -
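
Note: for readers unfamiliar with distribution-estimation search, the following is a minimal, self-contained sketch of the kind of loop the abstract describes: a categorical probability model over architectural choices is sampled, the best samples are used to re-estimate the model, and the final architecture is read off the learned distribution. The search-space encoding, the smoothed-frequency update rule, and the toy evaluate() objective are illustrative assumptions, not the authors' EDNAS implementation.

# Sketch of an estimation-of-distribution search loop in the spirit of
# EDNAS, based only on the abstract above. Encoding, update rule, and
# the evaluate() stub are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

NUM_EDGES = 8     # assumed: number of decisions defining an architecture
NUM_OPS = 5       # assumed: candidate operations per decision
POP_SIZE = 20     # architectures sampled per iteration
ELITE = 5         # top samples used to re-estimate the distribution
ITERATIONS = 30
LR = 0.3          # smoothing toward the elite frequencies

def evaluate(arch):
    """Stand-in for training/validating a sampled architecture.

    A toy objective (prefer operation 0 everywhere) so the sketch runs
    end to end; in EDNAS this would be validation-set performance.
    """
    return float(np.sum(arch == 0))

# Probability model over the search space: one categorical distribution
# per architectural decision, initialized uniform.
probs = np.full((NUM_EDGES, NUM_OPS), 1.0 / NUM_OPS)

for step in range(ITERATIONS):
    # Sample a population of architectures from the current model.
    pop = np.stack([
        np.array([rng.choice(NUM_OPS, p=probs[e]) for e in range(NUM_EDGES)])
        for _ in range(POP_SIZE)
    ])
    scores = np.array([evaluate(a) for a in pop])

    # Re-estimate the distribution from the elite (best-scoring) samples.
    elite = pop[np.argsort(scores)[-ELITE:]]
    freq = np.stack([
        np.bincount(elite[:, e], minlength=NUM_OPS) / ELITE
        for e in range(NUM_EDGES)
    ])
    probs = (1 - LR) * probs + LR * freq

# Final architecture: the mode of the learned distribution.
best_arch = probs.argmax(axis=1)
print("selected operation per edge:", best_arch)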