IEEE Computational Intelligence Magazine - August 2021 - 23

a student-teacher framework. By eliminating the inferior or
least promising operations, the evolutionary process is greatly
accelerated. Experimental results show that we achieve state-of-the-art
performance for downstream applications, such as object
recognition, object detection, and instance segmentation.
I. Introduction
Learning high-level representations from labeled data and
deep learning models in an end-to-end manner is one of
the biggest successes in computer vision in recent history.
These techniques make manually specified features largely
redundant and have greatly improved the state-of-the-art for many
real-world applications, such as medical imaging, security, and
autonomous driving. However, there are still many challenges. For
example, a representation learned through supervised image
classification may lack information such as texture, which
matters little for classification but can be more relevant for
downstream tasks. Yet adding such information makes the representation
less general, since it might be irrelevant for tasks such as image
captioning. Thus, improving representation learning requires features
that are focused on solving a specific task. Unsupervised learning is an important
stepping stone towards robust and generic representation learning
[1]. The main challenge is the significant performance gap compared
to supervised learning. Neural architecture search (NAS) has
shown extraordinary potential in deep learning due to the customization
of the network for target data and tasks. Intuitively,
using the target data and searching on the target task directly, without a
proxy gap, will result in the least domain bias. Owing to the computational
cost, however, early NAS [2] methods only searched small datasets.
Later, transfer methods were utilized to search for an optimal
architecture on one dataset, then adapt the architecture to work on
larger datasets [3]. ProxylessNAS [4] was proposed to search larger
datasets after directly learning the architecture for the target task
and hardware, instead of using a proxy. Recent advances in NAS
show a surge of interest in neural architecture evolution (NAE)
[3], [5]. NAE first searches for an appropriate architecture on a
small dataset and then transfers the architecture to a larger dataset by
simply adjusting the weights. One key to the success of NAS or NAE
is the use of large-scale data, suggesting that more data leads to better
performance. The problem, however, is that the cost of labeling
these larger datasets may become prohibitive as their size grows.
In scenarios where we cannot obtain sufficient annotations,
self-supervised learning is a popular approach to leverage the
mutual information of unlabeled training data. However, the performance
of unsupervised methods is still unsatisfactory compared
with that of supervised methods. One obstacle is that only the parameters
are learned using conventional self-supervised methods. To
address the performance gap, a natural idea is to explore the use of
NAE to optimize the architecture along with parameter training.
Tackling these bottlenecks will allow us to design and implement
efficient deep learning systems that will help to address a variety of
practical applications. Specifically, we can initialize with an architecture
found using NAS on a small supervised dataset and then
evolve the architecture on a larger dataset using unsupervised
learning. Currently, existing architecture evolution methods [3],
[5] are inefficient and cannot deal effectively with unsupervised
representation learning. Our approach is extremely efficient, with
a complexity of O(n²), where n is the size of the operation space.
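As an illustration of how such a quadratic cost can arise, suppose each candidate operation is compared against every other exactly once; this is an assumption of the sketch, not the paper's exact procedure, and the function and operation names are hypothetical:

```python
from itertools import combinations

def pairwise_comparisons(ops):
    """Every unordered pair of candidate operations: comparing each
    pair once costs n * (n - 1) / 2 evaluations, i.e. O(n^2)."""
    return list(combinations(ops, 2))

# Four hypothetical candidate operations yield 4 * 3 / 2 = 6 comparisons.
pairs = pairwise_comparisons(["conv3x3", "conv5x5", "skip_connect", "max_pool"])
```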
Here we propose our Fast and Unsupervised Neural Architecture
Evolution (FaUNAE) method to search architectures for representation
learning. Although UnNAS [6] discusses the value of labels
and discovers that labels are not necessary for NAS, it cannot
solve the aforementioned problems because it is computationally
expensive and is trained using supervised learning for real applications.
FaUNAE is introduced to evolve an existing architecture,
either manually designed or searched on one small-scale dataset,
on another large-scale dataset. This partial optimization
can utilize existing models to reduce the search cost
and improve search efficiency. The strategy is more practical for
real applications, as it can efficiently adapt to new scenarios with
minimal data labeling requirements.
First, we adopt a trial-and-test method to evolve the initial
architecture; this is more efficient than traditional evolution
methods, which are computationally expensive and require large
amounts of labeled data. Second, we note that the quality of the
architecture is hard to estimate due to the absence of labeled data.
To address this, we explore contrastive loss [1] as the evaluation
metric for operation evaluation. Building on the contrastive
loss [1], we adopt a teacher-student framework to mimic
supervised learning and thereby estimate operation performance
even without annotations.
Then the architecture can be evolved based on the estimated performance.
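As a rough illustration of how a contrastive loss can score features without labels, the following sketch computes an InfoNCE-style loss for a single query embedding against one positive key (another view of the same image) and several negative keys. This is the generic formulation from the contrastive-learning literature, not the authors' exact implementation; the function name and default temperature are assumptions.

```python
import numpy as np

def info_nce(query, pos_key, neg_keys, temperature=0.07):
    """InfoNCE loss for one query vector.

    The positive key comes from another augmented view of the same
    image; the negative keys come from different images.
    """
    q = query / np.linalg.norm(query)
    k_pos = pos_key / np.linalg.norm(pos_key)
    k_neg = neg_keys / np.linalg.norm(neg_keys, axis=1, keepdims=True)
    # Similarity of the query to the positive (index 0) and the negatives.
    logits = np.concatenate(([q @ k_pos], k_neg @ q)) / temperature
    logits -= logits.max()  # numerical stability before the softmax
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy with the positive as target
```

The loss is small when the query and its positive agree and the negatives differ, which is what makes it usable as an annotation-free quality signal.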
Third, we address the fact that one bottleneck in NAS
is its explosive search space of up to 14^8. The search space issue is
even more challenging for unsupervised NAS built on an ambiguous
performance estimation that further deteriorates the training
process. To address this issue, we build our search algorithm based
on the principles of survival of the fittest and elimination of the
inferior. This significantly improves search efficiency. Our framework
is shown in Fig. 1. Our contributions are as follows:
❏ We propose unsupervised neural architecture evolution,
which utilizes prior knowledge to search for an architecture
using large-scale unlabeled data.
❏ We design a new search block and limit the model size
according to the initial architecture during evolution to deal
with the huge search space of ResNet. The search space is
further reduced through a contrastive loss in a teacher-student
framework by abandoning operations with less potential,
which significantly improves the search efficiency.
❏ Extensive experiments demonstrate that the proposed
methods achieve better performance than prior art on ImageNet,
PASCAL VOC, and COCO.
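The survival-of-the-fittest elimination outlined above can be sketched as a progressive pruning of a candidate operation pool. This is a minimal illustration, not the authors' exact algorithm: score_fn stands in for the contrastive, teacher-student performance estimate, and all names, the drop fraction, and the round count are assumptions.

```python
def evolve_operation_pool(ops, score_fn, drop_frac=0.25, rounds=3):
    """Progressively discard the least promising candidate operations.

    ops:       candidate operation names for one searchable position
    score_fn:  maps an operation to an estimated quality (higher is
               better); a stand-in for the contrastive evaluation
    drop_frac: fraction of the pool abandoned per round
    """
    pool = list(ops)
    for _ in range(rounds):
        if len(pool) <= 1:
            break  # nothing left to eliminate
        ranked = sorted(pool, key=score_fn, reverse=True)
        keep = max(1, int(len(ranked) * (1 - drop_frac)))
        pool = ranked[:keep]  # survival of the fittest
    return pool
```

Each round shrinks the space the search must cover, which is the mechanism behind the efficiency gains claimed above.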
II. Related Work
A. Unsupervised Learning
Recent progress on unsupervised/self-supervised¹ learning
began with artificially designed pretext tasks, such as patch

¹ Self-supervised learning is a form of unsupervised learning.