IEEE Circuits and Systems Magazine - Q2 2020 - 37

different HD applications, making FPGAs a suitable platform
for hardware acceleration [38]. Moreover, as proposed
in [39], combining HD computing with the concept of in-memory computing, which features RAM storage
and parallel distributed processing, may create opportunities for
HD acceleration. Additionally, several emerging nanotechnologies, including carbon nanotube field-effect
transistors (CNFETs) [40], resistive RAM (RRAM) [9],
and monolithic 3D integration [41], have demonstrated
implementations of HD computing at high speed [40]. Dimensionality reduction has also been evaluated in an actual
prototyped system using vertical RRAM (VRRAM) in-memory kernels in [42].
IV. Applications in HD Classification
In what follows, some classical HD computing applications in classification tasks, as well as several novel
design approaches that balance the tradeoff between accuracy and efficiency, are described. They are categorized
by their input data types, namely letters, signals,
and images.
A. Letters
1) European Language Recognition
Using HD Computing
HD computing for European language recognition was
first explored in [30]. The work in [40] presents an HD computing nanosystem, which implements HD operations
with emerging nanotechnologies (CNFETs, RRAM,
and 3D integration), offering large memory arrays
and reduced energy consumption. From
three-letter sequences called trigrams, such a nanosystem can identify the language of a given sentence [40].
A profile is defined as a histogram of trigram frequencies in
the unclassified text. The basic idea is to compare the trigram profile of a test sentence with the trigram profiles
of 21 languages, and then select the target language
whose trigram profile is most similar [30].
■■ Baseline. Scan through the text and count trigrams to compute a profile. A total of 27³ = 19,683
trigrams are possible for the 26 letters and the
space. Thus the trigram counts can be encoded
into a 19,683-dimensional vector, and such vectors
can be compared to find the language with the
most similar profile. However, this straightforward
and simple approach generalizes poorly. Specifically, compared to trigrams,
higher-order N-grams
have much higher complexity. For example, the number of possible pentagrams is 27⁵ = 14,348,907.
■■ HD classification algorithm. 1). Randomly choose a set of 27
letter hypervectors, serving as the seed
hypervectors. Note that all training and test data
employ the same seeds. In this design, the dimensionality is selected to be 10,000. 2). Generate trigram hypervectors with permutation and multiplication. For example, let (a, b, c) represent a trigram.
Rotate hypervector A twice and hypervector
B once, use hypervector C unchanged, and
then multiply them component by component as
described in Eq. (13). 3). The profile hypervector is then the sum of all the trigram hypervectors in the text. 4). Compare the profile of a test sentence to the language profiles, and return the most
similar one as the classification result.
Compared to the baseline algorithm, the HD algorithm
generalizes better to any N-gram size when 10,000-dimensional hypervectors are used.
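The four steps above can be sketched in NumPy. This is a minimal illustration, assuming bipolar (±1) random hypervectors, cyclic rotation (np.roll) as the permutation, and cosine similarity for the final comparison; the function and variable names are illustrative, not from the original design.

```python
import numpy as np

D = 10_000                                    # dimensionality used in the design
ALPHABET = "abcdefghijklmnopqrstuvwxyz "      # 26 letters plus the space
rng = np.random.default_rng(0)

# Step 1: one random bipolar seed hypervector per letter,
# shared by all training and test data.
item_memory = {ch: rng.choice([-1, 1], size=D) for ch in ALPHABET}

def trigram_hv(a, b, c):
    """Step 2: rotate the first letter's hypervector twice, the second
    once, leave the third unchanged, and multiply component-wise."""
    return np.roll(item_memory[a], 2) * np.roll(item_memory[b], 1) * item_memory[c]

def text_profile(text):
    """Step 3: the profile hypervector is the sum of all trigram
    hypervectors in the text."""
    acc = np.zeros(D)
    for i in range(len(text) - 2):
        acc += trigram_hv(text[i], text[i + 1], text[i + 2])
    return acc

def classify(query_text, class_profiles):
    """Step 4: return the language whose profile is most similar
    (cosine similarity) to the query profile."""
    q = text_profile(query_text)
    def cosine(u, v):
        return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
    return max(class_profiles, key=lambda lang: cosine(q, class_profiles[lang]))
```

Because the letter hypervectors are random and high-dimensional, unrelated trigram hypervectors are nearly orthogonal, which is why the summed profile still separates languages well at N-gram sizes where explicit counting becomes infeasible.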
The HD classification hardware architecture for language recognition using trigrams proposed in [21] is
shown in Fig. 9. Two main modules are implemented:
the encoding module and the search module. 1).
The encoding module takes a stream of letters as input. Each letter is mapped to the HD space, and its corresponding randomly generated hypervector is stored in
the item memory. The module then forms trigrams, where
each group of three letter hypervectors produces a trigram
hypervector. These trigram hypervectors are accumulated,
and the majority operation is applied with a threshold
to generate a text hypervector. 2). During the training
phase, a total of 21 text hypervectors are trained as the
learned class hypervectors and are stored in the associative memory of the search module. During the testing
phase, the encoding module generates the text hypervector of a test sentence as a query hypervector. This query hypervector
is then broadcast to the search module and compared
to the stored class hypervectors; the language label
with the closest similarity is returned as the prediction. As listed in Table 4,
the HD classifier achieves 96.70% accuracy.
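The two modules can be sketched as follows. This is a behavioral sketch only, assuming binary {0,1} hypervectors, a majority threshold at half the number of accumulated trigrams, and Hamming distance as the similarity measure in the associative memory; it does not model the actual circuit of Fig. 9.

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(1)

def majority(trigram_hvs):
    """Encoding module back end: accumulate binary trigram hypervectors
    and threshold each component at half the count (majority operation)
    to produce one binary text hypervector."""
    sums = np.sum(trigram_hvs, axis=0)
    return (sums > len(trigram_hvs) / 2).astype(np.uint8)

def search(query_hv, class_memory):
    """Search module: compare the query hypervector against every stored
    class hypervector by Hamming distance; return the closest label."""
    best_label, best_dist = None, query_hv.size + 1
    for label, hv in class_memory.items():
        dist = int(np.count_nonzero(query_hv != hv))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

In hardware, the Hamming-distance comparison maps naturally onto an associative memory: an XOR array followed by popcount-and-compare logic, evaluated for all 21 class hypervectors in parallel.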
Using the same architecture shown in Fig. 9, combined
with the emerging nanotechnologies (CNFETs,
RRAM, and their monolithic 3D integration), the HD computing hardware implementation achieves classification accuracy up to 98% for over 20,000 sentences [40].