IEEE Computational Intelligence Magazine - August 2021 - 68

I. Introduction
Deep learning has achieved great success in many
fields [1], such as speech recognition [2], semantic
segmentation [3], [4], image recognition [5], [6], and
natural language processing [7]. With the excellent
performance in these fields, convolutional neural networks
(CNNs) have become one of the most widely used models in
deep learning [8]. In general, conventional CNNs consist of
convolutional layers, pooling layers, and fully-connected layers.
Typical CNNs include AlexNet [9], VGG [10], GoogLeNet
[11], ResNet [5], and DenseNet [12]. Although these networks
have achieved great accuracy improvements in vision tasks,
designing CNN architectures remains a difficult task because
of the large number of parameters [13]. These models are all
hand-crafted and cannot learn their architectures by themselves;
thus, designers need considerable expert knowledge in
CNN architecture design [14]. Moreover, CNN architecture
design is problem-dependent: the appropriate architecture depends
on the data distribution at hand, so a manually designed
architecture lacks flexibility. The two main factors that affect
the performance of a CNN are its architecture and its weights.
For weight optimization, the gradient descent algorithm has
been proven to offer a significant advantage [15].
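To make this contrast concrete, the following minimal sketch (illustrative only; the toy loss and the names used are not from the cited works) shows why weights admit gradient descent while architecture choices do not:

```python
# Weights are continuous and the loss is differentiable in them,
# so a gradient-descent update rule applies directly.

def grad_descent(grad, w0, lr=0.1, steps=100):
    """Follow the negative gradient of a differentiable objective."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Toy loss L(w) = (w - 3)^2 with gradient dL/dw = 2(w - 3).
w_star = grad_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
print(round(w_star, 4))  # prints 3.0

# Architecture choices, by contrast, form a discrete set: there is no
# dL/d(architecture) to follow, hence the need for search methods.
ARCHITECTURE_CHOICES = ["conv3x3", "conv5x5", "maxpool", "identity"]
```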
Unfortunately, for the optimization of CNN architectures,
there is no explicit function that directly yields the optimal
architecture [16]. To address these problems, Zoph et al. [17]
of Google proposed the concept of neural architecture search
(NAS) for deep neural networks. Following this work, a large
number of researchers have turned their attention to this field,
and it has become one of the most popular research topics in
the automated machine learning community.
The purpose of NAS is to automatically search for the optimal
architecture of a CNN so that the resulting network can
outperform hand-crafted ones. Moreover, NAS can reduce the
high cost of trial-and-error design.
The existing NAS algorithms can be categorized into three
types: 1) reinforcement learning based methods [18]; 2) gradient
based methods [19]; and 3) evolutionary computation based
(ENAS) methods [20]. Among these, reinforcement learning
based NAS requires massive computing resources, e.g., thousands
of graphics processing units (GPUs) running for more
than ten days on a medium-sized dataset such as CIFAR10.
For example, the method of Zoph et al. [17] required 800
GPUs and 28 days to complete the entire search, which is
impractical for most interested users. The gradient-based method is
more efficient than the reinforcement learning method; one of
the most famous works was proposed by Liu et al. [19]. However,
owing to a lack of theoretical support, the searched architectures
can be inadequate and unstable, and the process of
constructing a cell consumes significant computing resources
and requires much prior knowledge. The ENAS algorithms use the evolutionary
computation (EC) technique to design the CNN architecture.
In particular, EC is a population-based technique that searches
for the global optimum. Many EC techniques have been
proposed, such as genetic algorithm (GA) [21], particle swarm
optimization (PSO) [22], and artificial ant colony algorithm
[23]. Owing to its gradient-free nature, the EC technique
has been widely used in optimization problems [24], and even
in black-box problems whose objectives have no explicit
mathematical expression. Therefore, EC techniques have also
been used to solve NAS problems.
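As an illustration of how a population-based, gradient-free search queries only fitness values, consider a minimal genetic algorithm sketch on the classic OneMax problem (our own toy example under simple assumptions, not an algorithm from the cited works):

```python
import random

def evolve(fitness, dim=8, pop_size=20, generations=60, seed=0):
    """Minimal GA: the objective is queried as a black box, never differentiated."""
    rng = random.Random(seed)
    # Individuals are bit strings; the initial population is random.
    pop = [[rng.randint(0, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            i = rng.randrange(dim)                 # one-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Black-box objective: OneMax, i.e., maximize the number of 1-bits.
best = evolve(fitness=sum)
print(best)  # the search typically converges to the all-ones string
```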
In fact, two decades ago, the EC technique was already used
to optimize neural networks, an approach called neuroevolution
[25]. Its goal is to use EC to optimize both the architecture
and the weights of neural networks. However, neuroevolution
was proposed only for small- or medium-scale neural networks
[16]. With the development of neural networks, CNNs based
on deep neural networks (DNNs) have been proposed and
widely used in computer vision tasks, but they contain a large
number of parameters. Therefore, to solve the problem of
designing CNN architectures, the concept of ENAS was proposed
in the evolutionary computation community. According
to the literature, the first ENAS work is the LargeEvo algorithm,
proposed by Real et al. [26] of Google. LargeEvo uses a
GA to design CNN architectures; however, it employs only
mutation operators. Experimental results have demonstrated the effectiveness
of LargeEvo on CIFAR10 and CIFAR100 [27]. LargeEvo
directly evolves basic units in an open search space, sampling
and evaluating each candidate solution, which consumes a
large amount of computational resources. Because Real et al.
[26] used individual layers as the search space, the space
contains only relatively simple architectures: LargeEvo initializes
its models with basic layers, and each initial individual
contains only a fully connected layer. Xie et al. [28] searched
the same space, and their results indicated that even this trivial
search space is capable of evolving competitive architectures.
However, such a large search space brings a high computational
cost. To overcome this problem, some methods have been
proposed to constrain the search space, such as block-based
design methods. These methods incorporate state-of-the-art
architectural building blocks from the start, which greatly
reduces the search space while maintaining performance. Many
kinds of blocks have been adopted, such as ResNet blocks [5],
DenseNet blocks [12], and Inception blocks [11]. Block-based
methods are quite promising because the smaller number of
parameters greatly speeds up the search process.
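A hypothetical block-based encoding can be sketched as follows (the block names and network depth here are illustrative, not the exact scheme of any cited method): an individual is a short list of block identifiers, so a depth-n network drawn from k block types yields only k^n candidates.

```python
# Illustrative block vocabulary; real methods define their own block types.
BLOCKS = ["resnet_block", "densenet_block", "inception_block"]

def decode(individual):
    """Map a list of block indices to a readable architecture description."""
    return [BLOCKS[i] for i in individual]

# A depth-4 network described by 4 block choices: 3**4 = 81 candidates,
# versus a far larger space if every primitive layer (kernel size, pooling
# type, channel count, ...) were searched independently.
individual = [0, 2, 2, 1]
print(decode(individual))
# ['resnet_block', 'inception_block', 'inception_block', 'densenet_block']

block_space = len(BLOCKS) ** len(individual)  # 81
```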
Some researchers directly used the aforementioned blocks to
design CNN architectures [29], [30], whereas others have proposed
different kinds of blocks. For example, Chen et al. [31]
proposed eight kinds of blocks, including the ResNet block
and the Inception block, and encoded them into 3-bit strings.
They then used the Hamming distance to identify blocks with
similar performance. Song et al. [32] proposed three types of
residual dense blocks, which greatly constrain the search space
and reduce computational consumption. After determining the
search space, different evolution operators were employed to
generate new architectures. Sun et al. [14] applied polynomial
mutation to the encoded information, which is represented by
real numbers. To make mutation less random, Lorenzo et al.
[33] proposed a new Gaussian mutation guided by Gaussian
regression. Gaussian regression can predict the architectures
