
selection problems and selecting only a
small proportion of elements can lead to
good objective values (e.g., support in
pattern mining and total return in portfolio
optimization). As a consequence,
large-scale sparse MOPs widely exist in the real world, and a list of popular applications is presented in [22].
The difficulties of large-scale sparse MOPs mainly lie in the high-dimensional decision spaces and the relatively expensive function evaluations. Owing to their stochastic search strategies, MOEAs suffer from the curse of dimensionality when solving large-scale MOPs, since many more function evaluations are required to search a higher-dimensional decision space [27]. Given the expense of each function evaluation, it is unaffordable and impractical for conventional MOEAs to solve large-scale sparse MOPs properly, and novel search strategies should be customized to save function evaluations.
B. MOEAs for Large-Scale
MOPs and Sparse MOPs
The search strategies in existing large-scale MOEAs are mainly based on three ideas, i.e., variable grouping, dimensionality reduction, and novel variation operators. The variable grouping based MOEAs aim to divide the decision variables into multiple groups and optimize each group alternately, so that the high-dimensional decision space is converted into several low-dimensional spaces. These MOEAs suggest different grouping strategies to strike a balance among efficiency, convergence, and diversity, such as the random grouping in CCGDE3 [2], the control variable analysis in MOEA/DVA [5], the differential grouping in CCLSM [28], and the variable clustering in LMEA [6].
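For concreteness, the following minimal Python sketch shows the grouping idea under simplifying assumptions: a single-objective stand-in replaces the multi-objective machinery of the cited algorithms, and the helper names (random_groups, optimize_by_groups) are ours rather than any published implementation.

```python
import numpy as np

def random_groups(n_vars, n_groups, rng):
    """Randomly partition the variable indices into disjoint groups."""
    return np.array_split(rng.permutation(n_vars), n_groups)

def optimize_by_groups(objective, x0, n_groups=20, iters=50, rng=None):
    """Cooperative, group-wise search: each group of variables is perturbed
    in turn while all other variables stay fixed at the current solution.
    `objective` maps a full decision vector to a scalar to be minimized."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    groups = random_groups(len(x), n_groups, rng)
    for _ in range(iters):
        for g in groups:
            trial = x.copy()
            # Stand-in for an inner optimizer on the low-dimensional subspace.
            trial[g] += rng.normal(scale=0.1, size=len(g))
            if objective(trial) < objective(x):
                x = trial
    return x

# Usage: 1000 variables, but each inner step touches only ~50 of them.
rng = np.random.default_rng(0)
f = lambda v: float(np.sum(v ** 2))
best = optimize_by_groups(f, x0=rng.uniform(-1, 1, 1000), rng=rng)
```

In the cited cooperative frameworks, the inner perturbation would be replaced by a full multi-objective optimizer working on each subspace.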
The dimensionality reduction based MOEAs directly reduce the dimensionality of the decision space, thus quickly navigating to optimal subspaces and accelerating the convergence speed. These MOEAs learn optimal subspaces from the current population via different strategies, such as the problem transformation in WOF [7], the random embedding in ReMO [9], the problem reformulation in LSMOF [8], and the principal component analysis in PCA-MOEA [10].
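As an illustration of the dimensionality-reduction idea, the sketch below follows the general spirit of random embedding: the search operates on a low-dimensional vector that a fixed random matrix maps back to the original decision space. It is only a sketch under these assumptions, not the code of ReMO or any other cited algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 10_000, 20                 # original and embedded dimensionalities
A = rng.normal(size=(D, d))       # fixed random embedding matrix

def to_full_space(y, lower=-1.0, upper=1.0):
    """Map a d-dimensional point back to the D-dimensional decision space,
    clipped to the box constraints of the original problem."""
    return np.clip(A @ y, lower, upper)

def objective_in_embedding(y, objective):
    """Evaluate the original objective through the embedding, so any search
    method can work directly on the short vector y."""
    return objective(to_full_space(y))

# Usage with a toy objective: the space seen by the optimizer is only 20-D.
f = lambda x: float(np.sum(x ** 2))
y = rng.normal(size=d)
print(objective_in_embedding(y, f))
```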
The novel variation operator based MOEAs enhance the search ability of conventional variation operators (e.g., those in genetic algorithm [29], differential evolution [30], and particle swarm optimization [31]) by proposing new operators, such as the Gaussian process based inverse model in IM-MOEA [32], the competitive swarm optimizer in LMOCSO [11], the adaptive offspring generation in DGEA [12], and the covariance matrix adaptation evolution strategy in S3-CMA-ES [14].
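The flavor of such operators can be conveyed by a generic competitive-swarm-style update, where randomly paired particles compete and each loser learns from its winner; the sketch below is a simplified single-objective stand-in and does not reproduce the exact update used in LMOCSO.

```python
import numpy as np

def cso_step(positions, velocities, fitness, rng):
    """One competitive-swarm-style iteration: particles are paired at random,
    winners (smaller fitness) are kept unchanged, and each loser updates its
    velocity toward its winner before moving."""
    dim = positions.shape[1]
    order = rng.permutation(len(positions))
    for a, b in zip(order[::2], order[1::2]):
        win, lose = (a, b) if fitness[a] <= fitness[b] else (b, a)
        r1, r2 = rng.random(dim), rng.random(dim)
        velocities[lose] = r1 * velocities[lose] + r2 * (positions[win] - positions[lose])
        positions[lose] += velocities[lose]
    return positions, velocities

# Usage on a toy minimization problem: 200 particles, 1000 decision variables.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1000))
V = np.zeros_like(X)
f = lambda pop: np.sum(pop ** 2, axis=1)
for _ in range(50):
    X, V = cso_step(X, V, f(X), rng)
```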
Although these large-scale MOEAs
can be employed to solve large-scale
sparse MOPs in theory, it is difficult for
them to directly find the exact optimal
values of most decision variables (i.e.,
zero) due to the stochastic search paradigm.
Instead, an algorithm is required to identify the decision variables that should be zero and optimize the remaining ones. To this end,
SparseEA [20] suggests a bi-level encoding scheme to represent the solutions for sparse MOPs, which includes a binary vector xb = (xb_1, xb_2, ..., xb_D) denoting whether each decision variable should be zero and a real vector xr = (xr_1, xr_2, ..., xr_D) denoting the value of each decision variable, where the decision variables x = (x_1, x_2, ..., x_D) for function evaluations are obtained by

x_i = xb_i × xr_i,   i = 1, 2, ..., D.   (3)
By
optimizing the real vector and
binary vector with different genetic
operators, SparseEA can search for the
zero decision variables and optimize the
other decision variables simultaneously.
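Read as code, Eq. (3) is an element-wise mask between the two vectors. The sketch below illustrates the bi-level encoding and how the two parts might be varied with different operators; the variation steps are illustrative placeholders, not SparseEA's actual operators.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 1000                                   # number of decision variables

# Bi-level encoding: a binary vector marking which variables are non-zero,
# and a real vector holding the candidate values of all variables.
xb = (rng.random(D) < 0.05).astype(float)  # sparse binary vector
xr = rng.uniform(-1.0, 1.0, size=D)        # real vector

# Eq. (3): the decision vector actually used for function evaluations.
x = xb * xr

# The two parts can be varied with different operators, e.g. flipping a few
# bits of xb while perturbing xr with a real-coded mutation (placeholders).
flip = rng.random(D) < 1.0 / D
xb_child = np.where(flip, 1.0 - xb, xb)
xr_child = np.clip(xr + rng.normal(scale=0.1, size=D), -1.0, 1.0)
x_child = xb_child * xr_child
```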
Moreover, SparseEA generates the binary
vectors through a new population
initialization strategy, a crossover operator,
and a mutation operator, which can
maintain the sparsity of solutions. To
further improve the efficiency, MOEA/
PSL [22] adopts two unsupervised neural
networks (i.e., restricted Boltzmann
machine [33] and denoising autoencoder
[34]) to reduce the dimensionality of
the binary vector and the real vector,
respectively. Hence, a sparse distribution
and a compact representation of the
decision variables can be learnt from the
current population. Besides, MOEA/
PSL is equipped with a parameter adaptation
strategy for automatically determining
the parameters in training the neural networks.
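As a rough illustration of learning a compact representation from the current population, the sketch below trains a tiny denoising autoencoder (NumPy, single sigmoid hidden layer) on the real vectors of a population; the architecture, training loop, and hyperparameters are placeholders and do not reproduce the restricted Boltzmann machine or the denoising autoencoder used in MOEA/PSL.

```python
import numpy as np

class TinyDAE:
    """A minimal denoising autoencoder with tied weights, used only to
    illustrate how a compact code for a population's real vectors could be
    learnt; it is not MOEA/PSL's actual model or training procedure."""

    def __init__(self, n_in, n_hidden, lr=0.1, rng=None):
        self.rng = np.random.default_rng() if rng is None else rng
        self.W = self.rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.b = np.zeros(n_hidden)   # hidden bias
        self.c = np.zeros(n_in)       # output bias
        self.lr = lr

    @staticmethod
    def _sig(z):
        return 1.0 / (1.0 + np.exp(-z))

    def encode(self, x):
        return self._sig(x @ self.W + self.b)

    def decode(self, h):
        return self._sig(h @ self.W.T + self.c)

    def train(self, X, epochs=100, noise=0.1):
        for _ in range(epochs):
            X_noisy = X + self.rng.normal(scale=noise, size=X.shape)
            H = self.encode(X_noisy)
            R = self.decode(H)
            # Backpropagate the squared reconstruction error (tied weights).
            dR = (R - X) * R * (1 - R)
            dH = dR @ self.W * H * (1 - H)
            gW = X_noisy.T @ dH + dR.T @ H
            self.W -= self.lr * gW / len(X)
            self.b -= self.lr * dH.mean(axis=0)
            self.c -= self.lr * dR.mean(axis=0)

# Usage: compress 1000-dimensional real vectors of a population to 10 codes.
rng = np.random.default_rng(3)
pop = rng.random((50, 1000))
dae = TinyDAE(n_in=1000, n_hidden=10, rng=rng)
dae.train(pop, epochs=50)
codes = dae.encode(pop)   # compact representation, shape (50, 10)
```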
Similarly, PM-MOEA
[27] suggests an evolutionary pattern
mining approach to reduce the dimensionality
of the binary vector, which is
parameterless and provides better diversity
than neural networks. PM-MOEA
also proposes an unbalanced crossover
operator and an unbalanced mutation
operator, using different probabilities to
flip the binary variables in xb to ensure
the sparsity of offspring solutions. In
addition to the above three MOEAs,
some work also considers the sparsity of
solutions in multimodal optimization
[35] and expensive optimization [21] in
recent years.
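To illustrate how asymmetric flip probabilities keep offspring sparse, consider the sketch below; the helper name unbalanced_flip and the probability values are placeholders rather than the settings used in PM-MOEA.

```python
import numpy as np

def unbalanced_flip(xb, p_zero_to_one=0.001, p_one_to_zero=0.1, rng=None):
    """Mutate a binary vector with asymmetric flip probabilities so that the
    expected number of ones stays small, keeping the offspring sparse."""
    rng = np.random.default_rng() if rng is None else rng
    r = rng.random(xb.shape)
    to_one = (xb == 0) & (r < p_zero_to_one)   # rarely switch a variable on
    to_zero = (xb == 1) & (r < p_one_to_zero)  # more readily switch one off
    child = xb.copy()
    child[to_one] = 1
    child[to_zero] = 0
    return child

# Usage: a 1000-bit mask with ~2% ones stays sparse after mutation.
rng = np.random.default_rng(4)
xb = (rng.random(1000) < 0.02).astype(int)
child = unbalanced_flip(xb, rng=rng)
print(xb.sum(), child.sum())
```

Choosing p_one_to_zero much larger than p_zero_to_one keeps the expected number of ones small, so offspring tend to stay as sparse as their parents.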
To sum up, large-scale MOEAs converge
faster than conventional MOEAs
due to the grouping of variables, reduction
of decision space, and novel variation
operators. However, for sparse
MOPs with real variables (e.g., neural
network training and portfolio optimization),
large-scale MOEAs can hardly find the optimal values of zero. For sparse MOPs with binary variables (e.g., pattern mining and critical node detection), they are not applicable since the search strategies tailored for large-scale optimization can only work in continuous spaces. By contrast, sparse MOEAs can easily generate
sparse solutions with real or binary
variables, but they are ineffective for
solving the MOPs without sparse Pareto
optimal solutions. The applicability
of some representative MOEAs to
large-scale sparse MOPs is summarized
in Table I.
C. Performance Indicators for
Multi-Objective Optimization
To investigate the performance of
MOEAs, the quality of the solution sets
obtained by MOEAs should be quantified
by tailored performance indicators
[36], which can be divided into three