IEEE Computational Intelligence Magazine - February 2023 - 42

accommodate new tasks that are completely different from the previous ones. To address that objective, the model introduces a new measurement to regulate the evolution of graph data with different characteristics. The method is able to integrate both isotropic and anisotropic types [87] of GNNs.
The problem definition of this model is as follows. Assume the graph data comprise a finite number of tasks T_1, ..., T_t and a model f with parameters u. For incremental training in this method, while the structure of the graph, including its nodes and edges, changes over time, a model should already have been trained on the labels of task T_t to obtain the parameters u(t) before preparing for task T_{t+1}. When l new classes arrive, the method adds the corresponding number of parameters to the output layer of f(t). Therefore it will have a new parameterized output layer:
$|u_t^{\text{output weight}}| = |u_{t-1}^{\text{output weight}}| + l$,  (3)

$|u_t^{\text{output bias}}| = |u_{t-1}^{\text{output bias}}| + l$.  (4)
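The layer expansion in eqs. (3)-(4) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name, the NumPy representation of the output head, and the initialization scale for the new rows are all assumptions.

```python
import numpy as np

def expand_output_layer(W, b, l, rng=None):
    """Grow the output layer by l new class slots, as in eqs. (3)-(4):
    rows learned on previous tasks (parameters u(t)) are kept, and l
    freshly initialized rows are appended for the incoming classes."""
    rng = rng or np.random.default_rng(0)
    hidden = W.shape[1]
    new_rows = 0.01 * rng.standard_normal((l, hidden))  # new class weights
    W_new = np.vstack([W, new_rows])
    b_new = np.concatenate([b, np.zeros(l)])
    return W_new, b_new

W, b = np.zeros((4, 16)), np.zeros(4)      # output head trained on 4 classes
W, b = expand_output_layer(W, b, l=3)      # 3 new classes arrive
print(W.shape, b.shape)                    # (7, 16) (7,)
```

Keeping the old rows untouched means predictions for previously seen classes are preserved at the moment of expansion; only subsequent retraining can change them.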
In the next step, to enable the rehearsal approach, this method gives the option to either retrain the model from scratch (cold restart) or reuse the final model parameters from the previous tasks (warm restart). The model also introduces an artificial history size that limits the amount of past data used, as a controlled alternative to full-history retraining.
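The warm/cold restart choice and the limited history size described above can be sketched as a simple training loop. This is a hedged illustration of the idea only: `incremental_train` and the stand-in `train` function are hypothetical names, and the real method trains a GNN rather than recording task names.

```python
def train(replay, init):
    # Stand-in trainer: records which task batches were replayed and
    # whether previous parameters were reused (warm restart).
    return {"seen": [t["name"] for t in replay], "warm": init is not None}

def incremental_train(tasks, history_size, warm_restart=True):
    """Rehearsal loop with an artificial history size: only the most
    recent history_size tasks are replayed, instead of the full history."""
    params, history = None, []
    for task in tasks:
        history.append(task)
        replay = history[-history_size:]          # limited past data
        init = params if warm_restart else None   # warm vs. cold restart
        params = train(replay, init)
    return params

tasks = [{"name": f"T{i}"} for i in range(1, 6)]
final = incremental_train(tasks, history_size=2)
print(final["seen"])   # ['T4', 'T5'] -- only the limited history is replayed
```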
Since pre-compiled datasets for graph lifelong learning are rare, this model is evaluated on the scientific publication datasets PharmaBio and DBLP (easy and hard) [88]. The scenario is to learn new tasks as new classes are added. Two evaluation measures are used, average accuracy and forward transfer, to quantify the effect of reusing previous parameters. Several graph neural network techniques are used to perform representation learning in the model, such as GAT [78], GCN [27], and GraphSAGE [75]. The main result shows that incremental learning with a limited history size performs almost on par with the default setting that uses the full history of the graph. The GNN techniques in this model retain at least 95% accuracy when retrained with only half the dataset coverage compared to using all past data for incremental learning.
V. REGULARIZATION APPROACH
This approach uses a single model with a fixed capacity and leverages an additional term in the loss function to help consolidate knowledge while learning new tasks and to retain previous knowledge [19], [20]. Prior knowledge of graph structures and tasks is maintained to achieve stable performance while learning novel knowledge.
In graph-based learning, some current works that implement
regularization approaches are DiCGRL [68], GPIL [69],
TWP [70], and Translation-based Knowledge Graph Embedding
[71]. Other methods that partially implement a regularization approach on graph data, explained in more detail in Section VI, are ContinualGNN [10], LDANE [72], and TrafficStream [44].
A. Disentangle-Based Continual Graph Representation
Learning
Kou et al. [68] proposed Disentangle-based Continual Graph
Representation Learning (DiCGRL), which focuses on applying
lifelong learning to graph embedding. Current graph embedding models are impractical in real-world applications because they tend to ignore the streaming nature of incoming graph data. DiCGRL takes two steps to continuously learn new graph embeddings while preventing the forgetting of old knowledge. First, the disentangle module converts the graph's relational triplets (u, r, v), which describe the connection between nodes u and v under relation r, into several components according to their semantic aspects. The model uses an attention mechanism to learn the attention weights of the components of u and v based on the relation r. Then, the top-n related components are processed through a graph embedding mechanism that incorporates the features of nodes u and v. Two types of graph embedding are used to do this: knowledge graph embeddings (KGEs) [89], [90] and network embeddings (NEs) [78], [91].
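The disentangle step, scoring components against the relation and keeping the top-n, can be sketched as below. This is a simplified assumption-laden illustration: the dot-product scoring, the softmax, and the function name are placeholders for DiCGRL's actual attention mechanism.

```python
import numpy as np

def top_n_components(u_comps, v_comps, r_emb, n):
    """Score each semantic component of nodes u and v against the
    relation embedding r, softmax into attention weights, and keep
    the indices of the top-n related components."""
    comps = np.vstack([u_comps, v_comps])      # (K, d) component matrix
    scores = comps @ r_emb                     # relevance to relation r
    a = np.exp(scores - scores.max())
    a /= a.sum()                               # attention weights, sum to 1
    idx = np.argsort(a)[::-1][:n]              # top-n component indices
    return idx, a

rng = np.random.default_rng(0)
u_c, v_c = rng.standard_normal((3, 8)), rng.standard_normal((3, 8))
r = rng.standard_normal(8)
idx, a = top_n_components(u_c, v_c, r, n=2)
```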
The second step is the updating module, which aims to update the disentangled graph embeddings when new relation triplets appear. This model uses a regularization approach to prevent the forgetting problem. In this process, a step called neighbor activation identifies which relation triplets should be updated by running a selection mechanism inspired by the human ability to learn procedural knowledge. The top-n components with common characteristics are selected to update the embedding representation. Then, when new data arrive, the incoming data are trained together with those activated neighbor components using a constraint in the loss function to accommodate new knowledge while maintaining the performance on prior tasks. DiCGRL employs a constraint loss term L_norm to push the sum of the attention weights of the top-n selected components toward 1, i.e.,
$\mathcal{L}_{\text{norm}} = \sum_{(u,r,v)\in T_i} \left| 1 - \sum_{k=1}^{n} a_r^k \right|$,  (5)
where n indicates the number of components. The overall loss function implementing the regularization process in this method is then

$\mathcal{L} = \mathcal{L}_{\text{old}} + \mathcal{L}_{\text{new}} + \beta \cdot \mathcal{L}_{\text{norm}}$,  (6)

where $\beta$ is a hyperparameter.
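Eqs. (5)-(6) can be computed directly, as the sketch below shows. The function names are illustrative, and L_old and L_new are taken as given scalars rather than computed from the model.

```python
import numpy as np

def l_norm(attention_per_triplet, n):
    """Eq. (5): penalize each triplet (u, r, v) whose top-n attention
    weights do not sum to 1."""
    return sum(abs(1.0 - np.sort(a)[::-1][:n].sum())
               for a in attention_per_triplet)

def total_loss(l_old, l_new, attention_per_triplet, n, beta):
    """Eq. (6): L = L_old + L_new + beta * L_norm."""
    return l_old + l_new + beta * l_norm(attention_per_triplet, n)

a = np.array([0.6, 0.4, 0.0])             # top-2 weights already sum to 1
loss = total_loss(1.0, 2.0, [a], n=2, beta=0.5)
print(loss)                               # 3.0 -- no constraint penalty
```

When the top-n weights already absorb all the attention mass, the constraint term vanishes and the loss reduces to L_old + L_new.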
One technique to measure the performance of DiCGRL is to use link prediction and node classification tasks. For link prediction, it uses two datasets, FB15K-237 [92] and WN18RR [93], whose instances are set to arrive sequentially. Moreover, two types of knowledge graph embedding, TransE [90] and ConvKB [94], are used to perform graph representation learning. For node classification, it uses several datasets, such as Cora, CiteSeer, and Pubmed [75], with the scenario of adding instances sequentially. There are some strategies used to compare the
