IEEE Computational Intelligence Magazine - February 2023 - 44

regularizing the loss function while learning a new task.
When changing high-importance parameters is penalized, the
model can still perform well on previous tasks. The loss
function for a new task T_{k+1} is formulated as:

L'_{k+1}(W) = L^{new}_{k+1} + \sum_{n=1}^{k} I_n \odot (W - W^*_n)^2,   (7)

where L^{new}_{k+1} is the new loss function, e.g., cross-entropy
depending on the task, I_n denotes the network importance for the old
tasks, \odot means element-wise multiplication, and W^*_n contains the
optimal parameters for task T_n. This learning strategy ensures
that parameters with low importance scores can be adjusted freely,
while high-importance parameters are penalized and thus maintained
so that the model can still handle the previous tasks.
TWP also employs a strategy that promotes the minimization of
the computed importance scores to maintain the model's
plasticity.
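As a minimal sketch of how the penalty in Eq. (7) behaves, the snippet below evaluates an importance-weighted quadratic regularizer in NumPy; the toy weights and importance scores are invented for illustration, not taken from TWP.

```python
import numpy as np

def regularized_loss(new_loss, W, old_params, importances):
    """Eq. (7)-style objective: new-task loss plus an importance-weighted
    quadratic penalty anchoring W to each old task's optimum W*_n."""
    penalty = 0.0
    for W_star, I_n in zip(old_params, importances):
        # element-wise product: high-importance weights are penalized more
        penalty += np.sum(I_n * (W - W_star) ** 2)
    return new_loss + penalty

# toy example: two previous tasks over a 3-parameter model
W = np.array([1.0, 2.0, 3.0])
old_params = [np.array([1.0, 2.0, 2.0]),   # optimum for task 1
              np.array([0.5, 2.0, 3.0])]   # optimum for task 2
importances = [np.array([1.0, 0.0, 1.0]),  # second weight free to move
               np.array([1.0, 1.0, 0.0])]  # third weight free to move
loss = regularized_loss(0.7, W, old_params, importances)
```

Only the entries where the importance score is nonzero contribute to the penalty, so low-importance parameters can drift freely toward the new task's optimum.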
One of the experiments on this method uses a node classification
case study. Several datasets are used for node classification,
such as CoraFull [100], Amazon Computers [97], protein-protein
interaction (PPI) [4], and Reddit [35]. Three GNN backbones
are used to perform representation learning, namely
GAT [78], GCN [27], and GIN [80], each combined
with several general lifelong learning methods such as LwF [20],
GEM [23], EWC [19], etc. TWP achieved the highest average performance
of 0.976 and the lowest average forgetting
score of 0.001 ± 0.062 on the Reddit dataset. TWP demonstrates
the effectiveness and ability to perform lifelong learning
in the graph domain compared to other standard lifelong learning
methods.
D. Translation-Based Knowledge Graph Embedding
Song and Park [71] developed a translation-based knowledge
graph embedding method through continual learning that
aims to enrich the representation of knowledge graphs as
new representations appear in the graph data. Moreover, this
method minimizes the risk over new triples while penalizing
rapid parameter changes between the old and new embedding
models. Translation-based knowledge graph embeddings
aim to generate a vector representation in the embedding
space of the knowledge graph based on the entity-relation
information in the graph. The simplest model in translation-based
knowledge graph embedding is TransE [90], which
forms the embedding matrices of entities and relations in the
graph. In that case, the parameters used to create the embedding are
optimized using stochastic gradient descent.
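The TransE scoring idea can be sketched in a few lines; the tiny entity/relation vocabulary below is hypothetical, and a real implementation learns the embeddings with a margin-based ranking loss optimized by stochastic gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8
# hypothetical toy vocabulary: one embedding vector per entity/relation
entities = {name: rng.normal(size=dim)
            for name in ["paris", "france", "tokyo"]}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(h, r, t):
    """TransE models a triple (h, r, t) as a translation h + r ≈ t;
    the score is the distance ||h + r - t||, lower = more plausible."""
    return np.linalg.norm(entities[h] + relations[r] - entities[t])

s = transe_score("paris", "capital_of", "france")
```

Training then pushes the score of observed triples below the score of corrupted (negative) triples, which is what shapes the embedding space.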
The assumption in conventional knowledge graph embedding
methods [89], [90] is that graph entities and relations are
fixed after training. This method employs a regularization
term in order to accommodate the new embedding representation
while preserving the representation of previously
generated embeddings. The underlying concept of the regularization
approach in this model is to minimize the empirical risk
R^n_{emp}(\theta_n), defined as:

R^n_{emp}(\theta_n) = R_{emp}(D_n; \theta_n) + \lambda_n f(\theta_n, \theta_{n-1}),   (8)

where R_{emp}(D_n; \theta_n) denotes the empirical risk on the new data,
f(\theta_n, \theta_{n-1}) is a regularization term that penalizes the new
parameters so that \theta_n stays close to \theta_{n-1}, and \lambda_n is a
hyperparameter that controls how much \theta_n retains the knowledge
from previous data. As a
result, the method can achieve its objective by minimizing the
risk while penalizing rapid representation changes, so that the old
experiences are preserved in the new model.
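A minimal sketch of the objective in Eq. (8), assuming a squared-L2 distance for the regularizer f (the paper's exact choice of f may differ):

```python
import numpy as np

def continual_risk(emp_risk_new, theta_n, theta_prev, lam):
    """Eq. (8)-style objective: empirical risk on the new data plus a
    penalty f(theta_n, theta_{n-1}); here f is a squared L2 distance
    (an assumed choice for illustration)."""
    f = np.sum((theta_n - theta_prev) ** 2)
    return emp_risk_new + lam * f

theta_prev = np.zeros(4)                       # old embedding parameters
theta_n = np.array([0.1, 0.0, -0.1, 0.2])      # candidate new parameters
risk = continual_risk(0.5, theta_n, theta_prev, lam=2.0)
```

Raising lam pulls theta_n toward theta_prev (preserving old knowledge); lowering it lets the new triples dominate.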
One of the experiments with this method is a knowledge
graph embedding scenario with two knowledge graph
datasets: WN18 (WordNet) [93] and FB15k (Freebase) [92].
New instances of nodes and relations are set to arrive sequentially.
Several baseline models are used for comparison:
1) joint learning as an upper bound, 2) fine-tuning as a lower
bound that accommodates the new information continuously
without any knowledge consolidation, and 3) a general lifelong
learning method, EWC [19]. The best result of the experiment
achieved a Hits@10 score (proportion of correct entities ranked in
the top 10) of 94.27 in the triple classification task on the WN18 dataset. It is
better than the EWC and fine-tuning methods and almost equal to
the upper-bound joint learning technique.
VI. HYBRID APPROACH
The hybrid approach combines more than one lifelong learning
approach to take advantage of each approach and maximize the
performance of models [23], [24], [101]. Examples of the graph
lifelong learning models that implement hybrid approaches discussed
in this section include ContinualGNN [10], LDANE [72],
and TrafficStream [44].
A. ContinualGNN
In the actual scenario model of ContinualGNN proposed by
Wang et al. [10], a replay strategy is used to
refine the network as a complement to the regularization
approach. It combines both approaches to mitigate catastrophic
forgetting and maintain the existing learned patterns.
The first main goal of this model is to detect new patterns in the
graph structure that significantly influence the nodes in the
network. Second, the model aims to consolidate the knowledge
in the entire network. ContinualGNN captures
new patterns in streaming graph data and
proposes a method based on a propagation process to efficiently
mine the information of affected nodes when learning
new patterns. The existing knowledge from the propagation
process is then maintained using a combination of both
rehearsal and regularization strategies.
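The combination of rehearsal and regularization can be illustrated with a toy objective; this is a generic hybrid sketch (a replayed-memory loss plus an EWC-style importance-weighted penalty), not ContinualGNN's exact formulation.

```python
import numpy as np

def hybrid_loss(loss_new, loss_replay, theta, theta_old, fisher, lam):
    """Illustrative hybrid objective: loss on new streaming data
    + loss on replayed memory samples (rehearsal)
    + an importance-weighted quadratic penalty (regularization)."""
    reg = np.sum(fisher * (theta - theta_old) ** 2)
    return loss_new + loss_replay + lam * reg

theta = np.array([1.0, 0.5])       # current parameters
theta_old = np.array([0.8, 0.5])   # parameters after previous snapshot
fisher = np.array([2.0, 1.0])      # per-parameter importance estimates
total = hybrid_loss(0.3, 0.1, theta, theta_old, fisher, lam=0.5)
```

The replay term keeps gradients from old examples in the update, while the penalty term resists movement of important parameters, which is the complementary pairing the hybrid approach exploits.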
During the detection process for new patterns, ContinualGNN
considers the response to changes by looking at
the number of changes in the network. Changes that are too
small, for example, only adding an edge while the rest of the
neighborhood in the graph is very stable, are insufficient to be
considered new patterns. Retraining in that scenario is costly
and a waste of time. When nodes'