C. Evolution of CNNs
In addition to being motivated by Hubel's discovery [5], CNNs have also been improved based on another observation from neuroscience. An activation function, commonly the sigmoid function, performs a nonlinear transformation on the output of each convolutional kernel. However, nearly half of the neurons in a network are activated when the sigmoid function is used to compute the activation, which contradicts the neuroscientific observation that only 1%~4% of the neurons in the brain fire simultaneously [34]. Therefore, the rectified linear unit (ReLU) has been presented as a more accurate activation model of brain neurons, exhibiting three characteristics: one-sided inhibition, a wide excitation boundary, and sparsity [35].
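As a quick illustration (a minimal sketch, not from the cited papers), the following snippet contrasts the two activations on random pre-activations: the sigmoid never outputs exactly zero, so every neuron "fires" a little, while ReLU's one-sided inhibition zeros out all negative inputs and thus yields genuinely sparse activity. The exact sparsity level in a trained network depends on the learned pre-activation distribution, so the ~50% figure here is only for symmetric random inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)          # random pre-activations

sigmoid = 1.0 / (1.0 + np.exp(-z))
relu = np.maximum(0.0, z)

# Sigmoid outputs are always strictly positive; ReLU clamps all
# negative pre-activations to exactly zero (one-sided inhibition).
print(f"sigmoid zero fraction: {np.mean(sigmoid == 0.0):.2%}")  # ~0%
print(f"relu    zero fraction: {np.mean(relu == 0.0):.2%}")     # ~50%
```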
In recent years, many variants of the CNN have been proposed to improve network performance. As mentioned in Sections II-A and II-B, the receptive fields of the convolutional kernels are transformed in [23] and [24], which increases the robustness of the CNN to scale and shape transformations of objects. Parallel convolution in GoogLeNet [25] allows the network to capture features at multiple scales, and the shortcut connection in ResNet [26] makes it possible to train networks with more than 100 layers. Building on ResNet, DenseNet [36] contains denser shortcut connections. Each layer in DenseNet takes as input the feature maps of all the layers before it. That is to say, the features of a shallower layer are reused by all subsequent layers, which further improves performance.
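To make the connectivity pattern concrete, here is a minimal sketch of DenseNet-style feature reuse (the layer shapes and the random 1×1 projection standing in for a real convolution are illustrative assumptions, not the reference implementation): each new layer consumes the concatenation of every earlier feature map.

```python
import numpy as np

def conv_layer(x, out_channels, rng):
    """Stand-in for a convolutional layer: a random 1x1 projection + ReLU."""
    w = rng.standard_normal((x.shape[0], out_channels))
    return np.maximum(0.0, np.einsum('chw,co->ohw', x, w))

rng = np.random.default_rng(0)
features = [rng.standard_normal((16, 8, 8))]   # initial feature map (C, H, W)

for _ in range(3):                              # a small dense block
    x = np.concatenate(features, axis=0)        # reuse ALL earlier features
    features.append(conv_layer(x, 12, rng))     # growth rate of 12 channels

print([f.shape[0] for f in features])           # channel counts: [16, 12, 12, 12]
```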
In terms of lightweight models, some studies divide the convolution operation into several groups to accelerate inference by reducing the amount of computation [37], [38]. Given an input feature of size 64 × 64 × 28 and a convolutional kernel of size 9 × 9 × 16, the amount of computation in a convolution is 28 × 9 × 9 × 16 = 36288. With group convolution, the input feature is first divided into 2 groups. Then, each group (64 × 64 × 14) is filtered by half of the original convolutional kernels (9 × 9 × 8). Therefore, the amount of computation is 14 × 9 × 9 × 8 × 2 = 18144. In addition, the number of parameters can be decreased by separating the convolution [39]. This variant separates a standard convolution (3-D) into a convolution inside each channel (2-D) and a 1 × 1 convolution among channels (1-D).
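The savings can be checked with a few lines of arithmetic. The first two counts reproduce the per-output-position numbers above; the depthwise-separable count is our extrapolation from the description of [39], not a figure stated in the text.

```python
# Multiplications per output position for the example above.
c_in, k, c_out = 28, 9, 16

standard = c_in * k * k * c_out
print(standard)                       # 28 * 9 * 9 * 16 = 36288

groups = 2
grouped = (c_in // groups) * k * k * (c_out // groups) * groups
print(grouped)                        # 14 * 9 * 9 * 8 * 2 = 18144

# Depthwise-separable variant [39]: one 2-D convolution per input channel,
# then a 1x1 convolution across channels (assumed cost model).
separable = c_in * k * k + c_in * c_out
print(separable)                      # 2268 + 448 = 2716
```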
Besides, the attention mechanism is also adopted to improve a CNN's ability to capture key information. In general, the attention mechanism is implemented by multiplying a mask with the corresponding attention domain. On the one hand, the mask can be applied in the spatial domain to highlight the significant spatial areas in the feature maps [40]. On the other hand, the attention mechanism can be applied in the channel domain to focus on key feature channels [41]. Moreover, it is also practicable to introduce the attention mechanism into both the spatial and channel domains [42].
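A minimal sketch of the mask-multiplication idea follows. The shapes are arbitrary, and the pooling-plus-sigmoid masks are simplified stand-ins for the learned projection layers used in [40]-[42], so this shows only the re-weighting mechanics, not the exact published designs.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
feat = rng.standard_normal((32, 14, 14))        # feature maps (C, H, W)

# Channel attention: squeeze spatial dims, derive one weight per channel.
channel_mask = sigmoid(feat.mean(axis=(1, 2)))  # shape (C,)
feat_c = feat * channel_mask[:, None, None]

# Spatial attention: squeeze channels, derive one weight per location.
spatial_mask = sigmoid(feat.mean(axis=0))       # shape (H, W)
feat_cs = feat_c * spatial_mask[None, :, :]     # both domains, as in [42]

print(feat_cs.shape)   # (32, 14, 14): same shape, re-weighted activations
```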
IV. From Memory to RNNs
This section mainly focuses on the recurrent neural network (RNN) and how it is motivated from the perspective of information transmission.
A. Motivation on Information Transmission of RNN
Even the simplest human behavior can be broken down into a variety of serially ordered action sequences. Therefore, we need a model that processes sequential patterns in data such as speech and text. An intuitive approach is to process temporal data spatially. However, one shortcoming of this kind of method is that the duration of the input patterns must be fixed, which breaks the completeness of the temporal sequential information. Another problem is that it is hard for such an approach to recognize absolute displacements of patterns with similar relative structure.
For this reason, Elman proposes a network architecture with recurrent connections that represents time implicitly [11]. This new type of neural network is named the recurrent neural network (RNN). The network makes an inference on the data at each time step. In addition, the recurrent connection constructs a path from the hidden state of the previous time step to the input of the current time step. Therefore, the RNN can retain a "memory" of past information when making the current inference. To some extent, this process fits the cognitive mechanism of human brains: the understanding of an object depends mainly on our memory rather than on the information we currently receive.
B. Details of Architecture
The RNN takes as input one element of the sequential data at each time step. Meanwhile, as shown in Fig. 4(a), there is an additional layer of context neurons that preserves the hidden state of the previous time step. At each time step t, the hidden layer produces a hidden state s_t by integrating the new information from the input x_t with the previous information, i.e., the hidden state s_{t-1} (see Fig. 4(b)). This process can be formulated as:
s_t = h(U · x_t + W · s_{t-1} + b_h),	(2)
y_t = g(V · s_t + b_g),	(3)

where b_h and b_g are biases, h denotes the hyperbolic tangent (tanh) function, and g denotes the softmax function.
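Equations (2) and (3) translate directly into code. The following sketch (with arbitrarily chosen dimensions and random weights, purely for illustration) runs one forward pass over a short sequence:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d_in, d_hid, d_out = 4, 8, 3
U = rng.standard_normal((d_hid, d_in)) * 0.1    # input-to-hidden weights
W = rng.standard_normal((d_hid, d_hid)) * 0.1   # hidden-to-hidden (recurrent)
V = rng.standard_normal((d_out, d_hid)) * 0.1   # hidden-to-output weights
b_h, b_g = np.zeros(d_hid), np.zeros(d_out)

s = np.zeros(d_hid)                             # initial hidden state s_0
for x in rng.standard_normal((5, d_in)):        # a sequence of 5 inputs
    s = np.tanh(U @ x + W @ s + b_h)            # Eq. (2): update the memory
    y = softmax(V @ s + b_g)                    # Eq. (3): per-step output
    print(y.round(3))
```

Note that the same U, W, and V are reused at every time step; only the hidden state s carries information forward.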
Since all the functions adopted to obtain the actual output of the RNN are differentiable, the network architecture can be trained using backpropagation [44]. The backpropagation algorithm for the RNN is essentially equivalent to that for other network architectures. It is important to notice that the process of updating W and U is a little more complex. For W, assuming that L_t is the error
