IEEE Computational Intelligence Magazine - February 2023 - 43

In graph lifelong learning, consolidating
knowledge is necessary to alleviate
forgetting problems pertaining to
changes in the graph representation in
terms of nodes, relations, and tasks.
performance of the method. First, as a lower bound, fine-tuning is implemented, training continuously on the incoming data without employing lifelong learning. Second, as the upper bound, the model is retrained from scratch on all nodes whenever new information appears. Moreover, several general lifelong learning methods that do not consider graph representation learning are included, such as EWC [19], EMR [16], and GEM [23]. For link prediction, DiCGRL achieves the best average H@10 (the proportion of correct entities ranked in the top 10) on the FB15K-237 dataset, at 47.7%. For node classification, its highest average accuracy on the Pubmed dataset is 85%. DiCGRL improves on the general lifelong learning techniques and comes close to the results of the upper-bound baselines.
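The two bounding protocols described above can be sketched in plain Python. The `Model`, `train`, and `evaluate` interfaces are hypothetical stand-ins, not the paper's implementation; the sketch only shows the evaluation protocol that separates the lower bound (naive fine-tuning) from the upper bound (joint retraining).

```python
# Sketch of the two baseline protocols that bound lifelong-learning
# performance. The model/train/evaluate interfaces are hypothetical.

def finetune_baseline(model, task_stream, train, evaluate):
    """Lower bound: keep training the same model on each new task,
    with no mechanism against catastrophic forgetting."""
    scores = []
    seen = []
    for task in task_stream:
        seen.append(task)
        train(model, [task])              # update on the new task only
        scores.append([evaluate(model, t) for t in seen])
    return scores

def retrain_baseline(make_model, task_stream, train, evaluate):
    """Upper bound: retrain from scratch on all data seen so far."""
    scores = []
    seen = []
    for task in task_stream:
        seen.append(task)
        model = make_model()              # fresh parameters each time
        train(model, seen)                # joint training on everything
        scores.append([evaluate(model, t) for t in seen])
    return scores
```

Any lifelong learning method is then expected to land between the two score trajectories: above naive fine-tuning, at or below joint retraining.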
B. Graph Few-Shot Class-Incremental Learning
The ability to learn new classes incrementally is essential in
real-world scenarios. Zhen et al. [69] proposed GPIL, which implements lifelong learning on graphs to learn both newly encountered and previously seen classes. To enable a lifelong learning setting, the model first splits the dataset into two parts, i.e., base classes and pseudo novel classes, with disjoint label spaces. It then pre-trains the encoder on the base classes and keeps it throughout the pseudo incremental learning process, and performs meta-learning to learn an initialization with more transferable meta-knowledge through a novel pseudo incremental learning procedure, GPIL. During each meta-training episode, all few-shot classification tasks are sampled from the pseudo novel classes and base classes to imitate the incremental process used at evaluation. In this way, each meta-learning episode learns a transferable model initialization for the incremental learning phase. To minimize the
catastrophic forgetting problem, this method proposes a
Hierarchical Attention Graph Meta-learning framework
(HAG-Meta), which uses a regularization technique to modify the loss function during learning. It employs a dynamically scaled loss regularizer that applies a scale factor [95], [96] to multiple task levels to adjust their contribution to model training. The scaling factor reduces the contribution of insignificant classes while emphasizing the important tasks. Moreover, this method proposes two hierarchical attention modules: 1) task-level attention, which estimates the importance of each task to balance the contributions of different tasks, and 2) node-level attention, which maintains a better balance between prior knowledge and new knowledge within the nodes. The authors report excellent stability in consolidating knowledge and advantageous adaptability to new knowledge from minimal data samples.
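The combination of task-level attention weights with a dynamically scaled loss can be sketched as follows. This is a minimal illustration under assumptions of my own: the softmax attention over task scores and the focal-style scale factor are stand-ins for the paper's actual weighting, which is not fully specified here.

```python
import math

def scaled_task_loss(task_losses, task_scores, gamma=2.0):
    """Combine per-task losses with (hypothetical) task-level attention
    weights and a focal-style scale factor, loosely following the
    dynamically scaled regularizer described above."""
    # softmax over raw attention scores -> task-level attention weights
    exps = [math.exp(s) for s in task_scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # the scale factor down-weights tasks the model already handles well
    # (small loss -> small scale), emphasizing important/hard tasks
    scaled = [w * (1.0 - math.exp(-l)) ** gamma * l
              for w, l in zip(weights, task_losses)]
    return sum(scaled)
```

The design intent matches the text: tasks judged unimportant (low attention score) or already mastered (low loss) contribute little, so the gradient budget concentrates on the tasks that matter.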
The experiments on GPIL use a few-shot learning case study with graph few-shot learning datasets such as DBLP [88], Amazon-Clothing [97], and Reddit [35]. The classes are split into three parts: pre-training, meta-training, and evaluation. GPIL is compared with several methods, such as continual learning on graphs (ER-GNN [9]), standard continual learning (iCaRL [51]) modified to use a GNN, few-shot class-incremental learning (CEC [98]), etc. The experiments measure the performance dropping rate (PD), the accuracy drop from the first session to the last. There are 10 sessions in every experiment for each baseline method. GPIL achieves the lowest PD on the DBLP dataset, at 17.11%, outperforming the other baseline methods in minimizing performance degradation.
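The PD metric used in this comparison is simple enough to state exactly. A minimal sketch, assuming the standard first-session-minus-last-session definition used in few-shot class-incremental learning work:

```python
def performance_dropping_rate(session_accuracies):
    """Performance dropping rate (PD): the accuracy drop from the first
    session to the last one. Lower is better; the toy accuracies below
    are illustrative, not the paper's numbers."""
    if not session_accuracies:
        raise ValueError("need at least one session accuracy")
    return session_accuracies[0] - session_accuracies[-1]
```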
C. Topology-Aware Weight Preserving
The TWP method by Liu et al. [70] is a novel approach to strengthening lifelong learning and minimizing catastrophic forgetting in GNNs. TWP explicitly studies the local structure of the input graph and stabilizes the parameters that are important for topological aggregation. Given an input graph and its node embedding features, TWP estimates an importance score for each network parameter based on its contribution to the topological structure and task-related performance. Concretely, it calculates the gradients of the task-wise objective and of topological preservation with respect to each parameter, then treats the gradient as an index of parameter importance. After
learning previous tasks, the model gets the optimal parameters
by minimizing the loss on that task. Since not all parameters contribute equally, it is important to preserve the minimized loss by protecting the highly influential parameters. The approximate contribution of each parameter is calculated based on an infinitesimal change in that parameter, as in the general lifelong learning method Synaptic Intelligence (SI) [99]. Parameters that significantly contribute to
minimizing loss must be kept stable when learning future
tasks. Besides preserving the minimized loss, topological structure preservation is performed, since structural information in the graph plays an important role. The aim is to find the parameters with a substantial contribution to learning the topological information of the graph. A GAT [78] is employed to learn topological information around a center node by calculating the attention coefficients of its neighbors. After the graph is modeled by the GAT, the infinitesimal change in each embedding feature is calculated to obtain the importance score of each parameter. The regularization
approach accommodates the new task by minimizing the
performance degradation of the previous task. TWP penalizes
changes in important parameters for old knowledge by
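The gradient-based importance score and the resulting penalty on parameter drift can be sketched as follows. This is an illustrative EWC/SI-style reading of the mechanism described above, not TWP's actual formulation; the function names and the use of plain gradient magnitudes as importance are assumptions.

```python
import numpy as np

def importance_from_gradients(grads):
    """Assumed importance score: the magnitude of each parameter's
    gradient w.r.t. the combined task-wise and topology-preserving
    objectives serves as that parameter's importance."""
    return np.abs(grads)

def regularized_loss(new_task_loss, params, old_params, importance, lam=1.0):
    """Penalize changes to parameters that were important for old tasks,
    in the spirit of EWC/SI-style quadratic regularization."""
    penalty = np.sum(importance * (params - old_params) ** 2)
    return new_task_loss + lam * penalty
```

Under this scheme, moving a high-importance parameter away from its old value inflates the loss far more than moving an unimportant one, which is exactly the stability-plasticity trade-off the regularization approach targets.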