IEEE Circuits and Systems Magazine - Q2 2023 - 22

of size 3×3. Each convolutional layer except the first one was replaced by a TT-conv layer as described before. The authors also compared the proposed Tensor Train decomposition method for the convolutional layer with the naive approach, which applies the Tensor Train decomposition directly to the 4th-order convolutional kernel. As illustrated in Table 2, the proposed approach achieves accuracies similar to the naive baseline at the 2× compression level. The second network was adapted from the first one by replacing the average pooling with two fully connected layers of size 8192×1536 and 1536×512. Initially, the fully connected part was the memory bottleneck. As shown in Table 3, a 21.01× network compression with a 0.7% accuracy drop can be achieved by replacing only the fully connected layers with TT-FC layers as described before. The bottleneck then moves to the convolutional part. At this point, by additionally replacing the convolutional layers with TT-conv layers, an 82.87× network compression with a 1.1% accuracy drop can be achieved, as shown in Table 3. More details can be found in [21].

Table 2. Compressing a convolution-dominated CNN on the CIFAR-10 dataset [21]. Different rows with the same model name represent different choices of the TT-ranks. CR stands for compression rate.

Model            | Top-1 accuracy | CR
-----------------|----------------|------
conv (original)  | 90.7%          | -
TT-conv          | 89.9%          | 2.02×
TT-conv          | 89.2%          | 2.53×
TT-conv          | 89.3%          | 3.23×
TT-conv          | 88.7%          | 4.02×
TT-conv (naive)  | 88.3%          | 2.02×
TT-conv (naive)  | 87.6%          | 2.90×

Table 3. Compressing an FC-dominated CNN on the CIFAR-10 dataset [21]. Different rows with the same model name represent different choices of the TT-ranks. CR stands for compression rate.

Model               | Top-1 accuracy | CR
--------------------|----------------|-------
conv-FC (original)  | 90.5%          | -
conv-TT-FC          | 90.3%          | 10.72×
conv-TT-FC          | 89.8%          | 19.38×
conv-TT-FC          | 89.8%          | 21.01×
TT-conv-TT-FC       | 90.1%          | 9.69×
TT-conv-TT-FC       | 89.7%          | 41.65×
TT-conv-TT-FC       | 89.4%          | 82.87×
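The TT format underlying these layers factors a reshaped weight tensor into a chain of small 3-way cores. A minimal sketch of the generic TT-SVD procedure (sequential truncated SVDs) illustrates where the compression comes from; this is a generic illustration under assumed shapes and ranks, not the implementation of [21]:

```python
import numpy as np

def tt_svd(t, max_rank):
    """Decompose a d-way tensor into TT cores via sequential truncated SVD."""
    shape, cores, r_prev, c = t.shape, [], 1, t
    for k in range(len(shape) - 1):
        c = c.reshape(r_prev * shape[k], -1)        # unfold current remainder
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        r = min(max_rank, len(s))                   # truncate to the TT-rank
        cores.append(u[:, :r].reshape(r_prev, shape[k], r))
        c = s[:r, None] * vt[:r]                    # carry the rest forward
        r_prev = r
    cores.append(c.reshape(r_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, out.ndim - 1))

# A 64x64 weight matrix viewed as an 8x8x8x8 tensor, built to have TT-rank 4
rng = np.random.default_rng(0)
exact = tt_reconstruct([rng.standard_normal((1, 8, 4)),
                        rng.standard_normal((4, 8, 4)),
                        rng.standard_normal((4, 8, 4)),
                        rng.standard_normal((4, 8, 1))])
cores = tt_svd(exact, max_rank=4)
err = np.linalg.norm(tt_reconstruct(cores) - exact) / np.linalg.norm(exact)
full_params = exact.size                  # 4096 entries in the full tensor
tt_params = sum(c.size for c in cores)    # far fewer entries in the cores
print(err, full_params, tt_params)
```

Storing the cores instead of the full tensor is what yields the compression rates in Tables 2 and 3; the achievable ratio depends on the chosen mode sizes and TT-ranks.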
2) CP Decomposition: In [17], CP decomposition was applied in two steps: first the convolutional layer is decomposed with CP using the NLS algorithm, and then the entire network is fine-tuned using backpropagation. Two network architectures were tested: a small character-classification CNN from [68] and AlexNet [1]. The character-classification CNN, called CharNet, has four convolutional layers. This CNN was used to classify images of size 24×24 into one of 36 classes (10 digits + 26 characters). Only the second and third convolutional layers were compressed, since they account for more than 90% of the processing time. The second layer has 48 input channels and 128 output channels with filters of size 9×9. The third layer has 48 input channels and 128 output channels with filters of size 8×8. First, the second layer was compressed using CP decomposition with rank 64. Then all layers but the new ones were fine-tuned to reduce the accuracy drop. Finally, the third layer was approximated using CP decomposition with rank 64. Since this last approximation does not lead to a large accuracy drop, there is no need to fine-tune the network afterward. The compressed network is 8.5 times faster than the original one, while the classification accuracy drops by only 1% to 90.2%. AlexNet is one of the most common object recognition networks; it has eight layers, consisting of five convolutional layers and three fully connected layers. The second convolutional layer of AlexNet was compressed in [17] using CP decomposition. The running time of the second layer can be accelerated by 3.6× using a rank of 200 at the expense of a 0.5% accuracy degradation, or by 4.5× with a rank of 140 at the expense of ≈1% accuracy degradation. It is also noted in [17] that greedy CP decompositions like ALS work worse than NLS for CNN model compression.
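The speedup from a rank-R CP factorization of a d×d convolution comes from replacing one dense convolution with four cheap ones: a 1×1 channel projection into R components, a d×1 and a 1×d separable convolution within those components, and a final 1×1 projection to the output channels. A back-of-the-envelope multiply count for CharNet's second layer illustrates the effect (the output map size here is a hypothetical placeholder, and this theoretical count exceeds measured speedups such as the reported 8.5×):

```python
def conv_mults(c_in, c_out, d, h, w):
    """Multiplications for a dense d x d convolution on an h x w output map."""
    return c_in * c_out * d * d * h * w

def cp_conv_mults(c_in, c_out, d, h, w, rank):
    """Same layer factored by rank-R CP into four cheap convolutions:
    1x1 (c_in -> R), then d x 1 and 1 x d within R components, then 1x1 (R -> c_out)."""
    return h * w * rank * (c_in + d + d + c_out)

# CharNet's second layer: 48 -> 128 channels, 9x9 filters, CP rank 64
h = w = 16  # hypothetical output map size; it cancels in the ratio below
dense = conv_mults(48, 128, 9, h, w)
cp = cp_conv_mults(48, 128, 9, h, w, 64)
print(dense / cp)  # theoretical reduction in multiplications, roughly 40x
```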
The Tensor Power Method (TPM) [69] was used to apply CP decomposition to the convolutional kernels in [15]. Compared to ALS, TPM can achieve the same variance with a lower rank, since the rank-1 tensors found in the early steps of TPM represent most of the variance in the original tensor. TPM compresses the convolutional kernels by adding rank-1 tensors until a predefined number of rank-1 terms is found. First, TPM finds a rank-1 tensor K1 by minimizing ||K - K1||^2.
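The greedy deflation idea can be sketched as follows: each step fits the best rank-1 tensor to the current residual with a higher-order power iteration, subtracts it, and repeats until the requested number of terms is reached. This is a generic illustration of rank-1 deflation under simplifying assumptions, not the exact algorithm of [15] or [69]:

```python
import numpy as np
from functools import reduce

def rank1_power(t, iters=50, seed=0):
    """Best rank-1 approximation of tensor t via higher-order power iteration."""
    rng = np.random.default_rng(seed)
    vecs = [rng.standard_normal(n) for n in t.shape]
    vecs = [v / np.linalg.norm(v) for v in vecs]
    lam = 0.0
    for _ in range(iters):
        for k in range(t.ndim):
            c = t
            # contract every mode except k (highest axis first keeps indices valid)
            for j in range(t.ndim - 1, -1, -1):
                if j != k:
                    c = np.tensordot(c, vecs[j], axes=([j], [0]))
            lam = np.linalg.norm(c)
            vecs[k] = c / lam
    return lam, vecs

def tpm(t, num_terms, iters=50):
    """Greedy CP via deflation: peel off rank-1 terms one at a time."""
    residual = t.astype(float).copy()
    terms = []
    for i in range(num_terms):
        lam, vecs = rank1_power(residual, iters, seed=i)
        terms.append((lam, vecs))
        residual -= lam * reduce(np.multiply.outer, vecs)
    return terms, residual

# A 3-way tensor that is exactly the sum of two orthogonal rank-1 terms
e = np.eye(4)
t = 2.0 * reduce(np.multiply.outer, [e[0]] * 3) \
    + 1.0 * reduce(np.multiply.outer, [e[1]] * 3)
terms, residual = tpm(t, num_terms=2)
print(np.linalg.norm(residual))  # close to zero: both terms recovered
```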