IEEE Computational Intelligence Magazine - February 2023 - 40

The FGN architecture modification allows graph problems to be re-framed by transforming a node classification task into a feature graph classification task. The feature graphs can then be trained with regular mini-batches, much like CNNs [74]. By converting graphs into a representation suited to common CNN tasks, FGN allows each node of the input graph to be processed independently while new knowledge is being learned, without requiring the entire graph to be pre-processed. The conversion also transposes some graph relationships: fixed feature dimensions in the regular graph translate to a fixed number of nodes in the feature graph, and the addition of nodes to the adjacency matrix of the regular graph translates to the addition of adjacency matrices of feature graphs. The cross-correlations of connected feature vectors are accumulated to capture feature interactions, which are used to construct the feature adjacency. Finally, to build the feature graph, a new topology given by the feature adjacency matrix, specific types of layers can be used, such as the feature broadcast layer, feature transform layer, and feature attention layer.
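As a rough sketch of this conversion (with hypothetical helper names; the exact FGN construction is given in [74]), a node's feature vector supplies the nodes of its feature graph, and the feature adjacency can be accumulated from cross-correlations (outer products) of the features of connected nodes:

```python
import numpy as np

def feature_cross_correlation(x_i, x_j):
    """Cross-correlation (outer product) of two connected nodes' features."""
    return np.outer(x_i, x_j)

def build_feature_graph(node_id, X, edges):
    """Convert one node of a regular graph into a feature graph.

    X: (num_nodes, d) feature matrix; edges: list of (u, v) pairs.
    Returns the node's d features (the 'nodes' of the feature graph)
    and a (d, d) feature adjacency accumulated over its edges.
    """
    d = X.shape[1]
    A_feat = np.zeros((d, d))
    for u, v in edges:
        if u == node_id:
            A_feat += feature_cross_correlation(X[u], X[v])
        elif v == node_id:
            A_feat += feature_cross_correlation(X[v], X[u])
    # Symmetrize and scale so the feature adjacency is well-conditioned
    # (a simplifying assumption, not the paper's exact normalization).
    A_feat = A_feat + A_feat.T
    peak = np.abs(A_feat).max()
    if peak > 0:
        A_feat /= peak
    return X[node_id], A_feat
```

Each feature graph is a fixed-size object, which is what makes mini-batch training in the CNN style possible.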
One way to measure the performance of this model uses data-incremental and class-incremental scenarios. The experiments use several datasets, including Cora [75], Citeseer [75], Pubmed [75], and Ogbn-arXiv [76], [77]. FGN is compared with other graph models, such as GCN [27], graph attention networks (GAT) [78], GraphSAGE [35], and approximate personalized propagation of neural predictions (APPNP) [79], implemented in lifelong learning settings without knowledge consolidation. In the data-incremental scenario, all new samples are inserted in random order; FGN achieves an overall performance of 0.872 ± 0.009 on the Pubmed dataset. In the class-incremental scenario, all samples from each class are inserted into the model before moving to the next class, and FGN reaches 0.857 ± 0.003 on the Pubmed dataset. Moreover, it attains a forgetting rate of 3.49, the lowest among the compared models. These results show its effectiveness, with superior performance in both data-incremental and class-incremental tasks.
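The two evaluation protocols can be sketched as follows (a minimal illustration of how the sample streams differ, not the authors' exact experimental split):

```python
import random

def data_incremental_stream(samples, batch_size):
    """Data-incremental: all new samples arrive in random order,
    regardless of class."""
    shuffled = samples[:]
    random.shuffle(shuffled)
    return [shuffled[i:i + batch_size]
            for i in range(0, len(shuffled), batch_size)]

def class_incremental_stream(samples, labels):
    """Class-incremental: every sample of one class is presented
    before moving on to the next class."""
    stream = []
    for c in sorted(set(labels)):
        stream.append([s for s, y in zip(samples, labels) if y == c])
    return stream
```

Forgetting is then measured by how much accuracy on earlier batches (or classes) degrades after later ones have been learned.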
B. Hierarchical Prototype Networks
Hierarchical Prototype Networks (HPNs) by Zhang et al. [67] extract different levels of knowledge abstraction in the form of prototypes, which are used to represent the evolving graph over time. This method aims to accommodate incoming experience whilst retaining prior knowledge. It first leverages a set of atomic feature extractors (AFEs) to encode the attribute information and topological structure of target nodes. Two kinds of atomic embeddings are extracted in this process. The first is the atomic node embedding, produced by AFE_node, which generates an embedding representation E_A^node(v) of node v's feature information. The second is the atomic structure embedding, produced by AFE_struct; the atomic structure embedding of node v, denoted E_A^struct(v), encodes the relations between nodes within multiple hops of node v. Then, HPNs perform several functions to adaptively
select, compose, and store the embedding representations through three architectural levels of prototypes: atomic-level, node-level, and class-level. First, the atomic-level prototypes (A-prototypes) P_A are created from the atomic embedding sets; they describe low-level features of a node, similar to the output of the first layer of a deep neural network. Then, node-level prototypes (N-prototypes) P_N and class-level prototypes (C-prototypes) P_C are developed to capture high-level information about nodes. N-prototypes P_N are developed by capturing the relations of each node in the A-prototypes P_A, and C-prototypes P_C are generated from the N-prototypes P_N to describe the common features of a group of nodes. These conversions are analogous to the feature maps of a convolutional neural network at different layers.
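The two atomic feature extractors can be pictured as follows (a hedged sketch with assumed weight matrices W_node and W_struct and a simple mean aggregation; the actual AFEs in [67] are learned modules):

```python
import numpy as np

def afe_node(x_v, W_node):
    """Atomic node embedding E_A^node(v): encode node v's own features."""
    return np.tanh(W_node @ x_v)

def afe_struct(v, X, adj, W_struct, hops=2):
    """Atomic structure embedding E_A^struct(v): aggregate the features of
    nodes within `hops` hops of v, then encode the aggregate.

    adj: dict mapping each node to a list of its neighbors.
    """
    frontier = {v}
    neighborhood = set()
    for _ in range(hops):
        frontier = {u for n in frontier for u in adj[n]} - neighborhood - {v}
        neighborhood |= frontier
    if not neighborhood:
        return np.zeros(W_struct.shape[0])
    agg = np.mean([X[u] for u in sorted(neighborhood)], axis=0)
    return np.tanh(W_struct @ agg)
```

Both embeddings live in the same space, so the prototype layers above them can select and compose either kind.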
HPNs select only the relevant atomic embeddings to refine existing prototypes or create new ones, so as to maintain old knowledge while accommodating new knowledge. To select relevant knowledge, HPNs compute the maximum cosine similarity between each incoming atomic embedding and the atomic embeddings of the existing prototypes; the resulting similarities are ranked, and the top entries are taken according to a number specified in the hyperparameters. The HPNs then use cosine similarity to determine whether a new embedding has the same or different characteristics as the elements of the A-prototypes P_A. On one hand, when new embeddings are considered close to existing prototypes, the old knowledge is refined by minimizing the loss between the two. On the other hand, atomic embeddings that are not close to the atomic embeddings of existing prototypes are treated as new tasks, and a new prototype is generated by combining previous and new prototypes. When the new embeddings E_new(v) are similar to one another, they can introduce redundant knowledge. To avoid this, the model filters E_new(v) to obtain E'_new(v), which contains only distinct representations (as measured against the threshold t_A):

  ∀ e_i, e_j ∈ E'_new(v):  (e_i^T e_j) / (‖e_i‖_2 ‖e_j‖_2) < t_A.  (1)

With this strategy, the embeddings in E'_new(v) contain only new features. The A-prototypes P_A are then updated to accommodate the new embeddings:

  P_A = P_A ∪ E'_new(v).  (2)
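Equations (1) and (2) amount to a pairwise cosine-similarity filter followed by a set union; a minimal sketch (with t_A supplied as an assumed hyperparameter):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def filter_distinct(E_new, t_A):
    """Keep only embeddings whose pairwise cosine similarity stays
    below t_A, enforcing the constraint in Eq. (1)."""
    kept = []
    for e in E_new:
        if all(cosine(e, k) < t_A for k in kept):
            kept.append(e)
    return kept

def update_prototypes(P_A, E_new, t_A):
    """Eq. (2): P_A <- P_A ∪ E'_new(v)."""
    return P_A + filter_distinct(E_new, t_A)
```

A near-duplicate embedding fails the threshold test against an already-kept one and is dropped, so only genuinely new directions in the embedding space enter P_A.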
After this process, the selected A-prototypes P_A can be further matched to the N-prototypes P_N and C-prototypes P_C to obtain a hierarchical representation, which is fed to the classifier to perform node classification.
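The final hierarchical matching can be pictured as stacking the best-matching prototype from each level into one representation for the classifier (a schematic sketch; the actual matching in HPNs is learned):

```python
import numpy as np

def nearest_prototype(e, prototypes):
    """Return the stored prototype most similar (by cosine) to embedding e."""
    sims = [e @ p / (np.linalg.norm(e) * np.linalg.norm(p))
            for p in prototypes]
    return prototypes[int(np.argmax(sims))]

def hierarchical_representation(e_atomic, P_A, P_N, P_C):
    """Concatenate the matched A-, N-, and C-level prototypes into the
    representation fed to the node classifier."""
    return np.concatenate([nearest_prototype(e_atomic, P)
                           for P in (P_A, P_N, P_C)])
```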
This architectural approach of HPNs, which activates hierarchical representative embeddings and iteratively updates prototypes to accommodate new knowledge, achieves state-of-the-art performance in mean accuracy and mean forgetting, enabling lifelong learning on graph data. It is compared with general lifelong learning methods, such as EWC [19], LwF [20], and
