IEEE Computational Intelligence Magazine - February 2023 - 37

While general lifelong learning can accommodate new knowledge added to the model, it does not address the mechanisms of representation learning, which is the main component of graph learning paradigms.
and the tasks are predefined, and there is no mechanism to
accommodate new knowledge that comes sequentially. Lastly,
online learning deals with future data in sequential order but
has a different objective than lifelong learning. Online learning
aims to learn similar and predefined tasks sequentially to
achieve efficiency [60]. On the other hand, lifelong learning
aims to learn new knowledge while retaining previous experience
and using it to help with future tasks.
Another concept of some relevance is out-of-distribution
(OOD) generalization techniques [61], [62]. OOD generalization
aims to create a model that can perform well on test distributions
that differ from training distributions. The target of
that concept is to produce a robust predictor based on invariant
features that still have a similar domain [63]. This differs
from the objective of lifelong learning, however, which aims
to accommodate new data distributions with contexts different
from existing ones, while avoiding performance drift or
forgetting of knowledge learned from past data
distributions.
C. Relevant Graph Learning Characteristics
In order to utilize graph data in learning algorithms, the first
strategy is to capture the graph representation and convert it
into a meaningful feature representation. This is known as
graph representation learning. It is common in graph learning
problems to assume the graph is static and to model a fixed number
of tasks, in terms of classes and data instances, before the
training process begins. There are, however, specific fields
within graph learning that seek to address dynamic data. Spatio-temporal
graph networks apply dynamic graph learning
over time to account for temporal attributes that change, while
the structure of the graph remains static. Yu et al. [53] proposed
a novel framework called spatio-temporal graph convolutional
networks (STGCN) to develop time-series prediction
in the traffic domain. There are two essential components in
this type of spatio-temporal graph. The first is spatial forms
that maintain the static structure of the graph (e.g., in [53] the
traffic network). The second component is temporal features
that describe different values of relations between nodes that
change over time (e.g., sensor readings/values at the node).
The objective of this method is to perform convolutional
operations on a graph to utilize spatial information to forecast
the graph edges' temporal features. Guo et al. [55] developed a
novel attention-based spatial-temporal graph convolutional
network (ASTGCN) to address the same objective of forecasting
traffic situations. The main contribution of the ASTGCN
model is to employ spatial attention to consider the influence
of other nodes on traffic conditions. This method uses the
same approach of spatial-temporal convolutions to extract
valuable information from the graph's spatial form and temporal
features. Although those methods incorporate dynamic temporal
attributes in the graph, they have different characteristics from
the concept of lifelong learning. The embedding processes in
those methods are carried out sequentially at different timeframes,
and there is no knowledge transfer mechanism when a
new node representation appears in the graph. Moreover, spatio-temporal
neural network techniques have predefined and
fixed tasks that do not accommodate new knowledge,
so addressing the problem of catastrophic
forgetting is not discussed in this learning paradigm.
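To make the spatial/temporal split concrete, the following sketch (illustrative only, not the STGCN or ASTGCN implementation; the graph, dimensions, and weights are all hypothetical) applies a GCN-style spatial convolution over a fixed adjacency at every time step, followed by a simple sliding-window mean standing in for the gated temporal convolutions those models actually use:

```python
import numpy as np

# Hypothetical toy setup: a fixed road network (spatial form) whose
# sensor readings (temporal features) change at every time step.
rng = np.random.default_rng(0)
num_steps, num_nodes, in_dim, out_dim = 8, 4, 3, 2

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)           # static graph structure
X = rng.standard_normal((num_steps, num_nodes, in_dim))  # dynamic node features

# Symmetrically normalized adjacency with self-loops, GCN-style.
A_hat = A + np.eye(num_nodes)
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]

W = rng.standard_normal((in_dim, out_dim))          # shared spatial weights

# Spatial convolution applied independently at each time step.
H = np.stack([A_norm @ X[t] @ W for t in range(num_steps)])

# A sliding-window mean as a stand-in for a learned temporal convolution.
window = 3
Z = np.stack([H[t - window + 1:t + 1].mean(axis=0)
              for t in range(window - 1, num_steps)])
print(Z.shape)  # (6, 4, 2): (time, nodes, out_dim)
```

The key point the sketch captures is that `A_norm` and `W` are reused unchanged at every step: only the feature tensor `X` varies with time, matching the assumption of a static structure with dynamic attributes.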
Another graph learning approach that shares some similarities
with graph lifelong learning is dynamic graph learning
also called temporal graph embedding [64]. Dynamic
graph learning addresses two graph types: a discrete-time
evolving graph, represented as a collection
of graph snapshots at different time steps, and a continuous-time
evolving graph, which carries more information
and complexity and is defined as a graph stream instead of
snapshots [65]. Some works address discrete-time evolving
graphs, such as a work by Goyal et al. [40] called DynGEM,
based on deep autoencoders that aim to perform dynamic
embedding in graphs in discrete time with snapshots.
DynGEM uses the learned embedding from a previous time
step to initialize the current time step embedding so as to transfer
knowledge and ensure the embedding remains close to
that of the previous time step. Zhou et al. [42] proposed
a novel dynamic network embedding approach called
DynamicTriad, which aims to learn a representation of
dynamic graphs by imposing a triad (a group of three vertices)
structure while also preserving temporal
information. These dynamic graph learning methods
share graph lifelong learning's goal of transferring past
knowledge to produce a good representation of consolidated knowledge
in the next time step to help the model
learn efficiently.
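The warm-start idea behind DynGEM can be sketched as follows. This is a toy illustration under assumed details, not the paper's deep-autoencoder method: here the embedding is a simple factorization Z Zᵀ ≈ A fitted by gradient descent, and `snapshot` and `embed` are hypothetical helpers. The point is only that the embedding learned on snapshot t initializes training on snapshot t+1, so successive embeddings stay close:

```python
import numpy as np

rng = np.random.default_rng(1)
num_nodes, dim = 6, 2

def snapshot(p_edge):
    """Random symmetric adjacency, a stand-in for one graph snapshot."""
    A = (rng.random((num_nodes, num_nodes)) < p_edge).astype(float)
    A = np.triu(A, 1)
    return A + A.T

def embed(A, Z_init, steps=200, lr=0.01):
    """Fit Z so that Z @ Z.T reconstructs A, starting from Z_init."""
    Z = Z_init.copy()
    for _ in range(steps):
        grad = 4 * (Z @ Z.T - A) @ Z   # gradient of ||A - Z Z^T||_F^2
        Z -= lr * grad
    return Z

snapshots = [snapshot(0.4), snapshot(0.45), snapshot(0.5)]
Z = 0.1 * rng.standard_normal((num_nodes, dim))  # cold start for t = 0
for A_t in snapshots:
    Z = embed(A_t, Z)  # warm start: previous embedding initializes the next
```

Reinitializing `Z` randomly inside the loop would also fit each snapshot, but the warm start is what carries knowledge forward and keeps consecutive embeddings stable, which is the property the surveyed methods share with lifelong learning.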
For continuous-time evolving graphs, Ma et al. [66] proposed
a method called streaming graph neural networks
(SGNN) that performs representation learning on continuously
evolving node representations. The architecture of
SGNN consists of two components: an update component to
modify the interaction information of new relationships, and a
propagation component to propagate the update to the
relevant neighborhood based on a time-aware
long short-term memory (LSTM). These dynamic graph
learning approaches for both discrete and continuous networks
aim to perform representation learning based on every change
in graph snapshots or graph streams. However, there are some
noticeable differences between dynamic graph learning and
graph lifelong learning. Firstly, dynamic graph learning only