Self-Supervised Representation Learning for Evolutionary Neural Architecture Search

Chen Wei, Xidian University and Xi'an University of Posts & Telecommunications, CHINA
Yiping Tang, Xidian University, CHINA
Chuang Niu, Rensselaer Polytechnic Institute, USA
Haihong Hu, Yue Wang, and Jimin Liang, Xidian University, CHINA

Digital Object Identifier 10.1109/MCI.2021.3084415
Date of current version: 15 July 2021

Abstract-Recently proposed neural architecture search (NAS) algorithms adopt neural predictors to accelerate architecture search. The capability of neural predictors to accurately predict the performance metrics of neural architectures is critical to NAS, but obtaining training datasets for neural predictors is often time-consuming. How to obtain a neural predictor with high prediction accuracy using a small amount of training data is a central problem for neural predictor-based NAS. Here, a new architecture encoding scheme is first devised to calculate the graph edit distance of neural architectures, which overcomes the drawbacks of existing vector-based architecture encoding schemes. To enhance the predictive performance of neural

Corresponding Author: Jimin Liang (email: jimleung@mail.xidian.edu.cn).