IEEE Computational Intelligence Magazine - May 2022 - 68

AUC optimization aims to maximize the true positive rate (TPR) and minimize the false positive rate (FPR) simultaneously. Several efforts [14] employed multiobjective genetic programming (MOGP) to maximize AUC, regarding AUC maximization as a ROC convex hull (ROCCH) maximization problem. Recently, Zhao et al. [17], [18] considered a third objective, model complexity, and proposed two novel 3D-convex-hull-based multiobjective EAs (MOEAs) for AUC maximization. Because the convex hull is not the same as the Pareto front, Qiu et al. [20] proposed a novel multi-level knee-point-based EA (MKnEA-AUC) to overcome this limitation. However, current EA-based methods cannot cope with large-scale datasets due to their expensive evaluation cost. Many works employed online learning to address this limitation; they can be divided into two groups. The first group used a fixed-size buffer to store sampled instances with various labels [21], and the pairwise loss functions are then calculated on this buffer. The second group [2], [4] kept only the first- and second-order statistics of the instances and optimized the AUC metric in a single pass through the data.
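The pairwise structure that makes these loss functions expensive can be made concrete with a short sketch (illustrative code, not from the paper): AUC equals the Wilcoxon-Mann-Whitney statistic over all positive/negative score pairs, so a single exact evaluation costs on the order of n_pos * n_neg comparisons.

```python
def pairwise_auc(scores_pos, scores_neg):
    """Exact AUC as the Wilcoxon-Mann-Whitney statistic over all
    positive/negative pairs; ties contribute 0.5."""
    wins = 0.0
    for sp in scores_pos:          # n_pos iterations
        for sn in scores_neg:      # n_neg comparisons per positive
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# A ranker that fully separates the classes reaches AUC = 1.0:
print(pairwise_auc([0.9, 0.8], [0.2, 0.1]))  # 1.0
# One misranked pair out of four gives AUC = 0.75:
print(pairwise_auc([0.9, 0.3], [0.5, 0.1]))  # 0.75
```

The quadratic pair count is exactly what the buffer-based and one-pass methods above try to avoid.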
As mentioned above, batch learning methods are preferable to online AUC optimization methods when performance is the priority. However, the high computational cost of evaluation makes it challenging to apply batch AUC optimization methods to large-scale datasets. Conversely, online AUC optimization methods can handle large-scale datasets but achieve relatively low accuracy, because a buffer or summary statistics only weakly represent the whole dataset. AUC optimization is thus an expensive problem that demands a strategy balancing convergence and computational complexity. AUC optimization is related to evolutionary machine learning [49], [50]. To cope with large-scale data, Franco et al. [53], [54] introduced the integration of GPU-based evaluation with the ILAS windowing scheme [51], [52]. Moreover, Franco et al. [50] introduced an automatic tuning strategy for rule-based evolutionary machine learning based on problem structure identification, especially for finding an adequate set of hyperparameter values, which is otherwise a very expensive process. Nevertheless, none of the current AUC methods attempts to develop cheap tasks to aid the performance of the expensive problem, and none of them considers knowledge transfer between cheap and expensive tasks, which could improve AUC accuracy. The references most relevant to this idea are multi-task Bayesian optimization [55] and evolutionary machine learning with minions [56], which used several tasks with small data to help quickly optimize the original task with big data. However, these methods do not consider the characteristics of AUC optimization, which hinge on the pairing of positive and negative instances. In this paper, two issues therefore need to be addressed: 1) how to design the cheap task of the AUC optimization problem, and 2) how to transfer adequate knowledge between the cheap task and the expensive AUC optimization problem.
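A minimal sketch of the first issue, under the assumption that the cheap task is simply the same AUC problem evaluated on a small class-preserving sample of the data (function names and the sampling ratio are illustrative, not the paper's settings):

```python
import random

def build_cheap_task(positives, negatives, ratio=0.1, seed=0):
    """Sample a small-scale dataset from the original one, keeping
    instances from both classes so AUC remains well defined."""
    rng = random.Random(seed)
    n_pos = max(1, int(len(positives) * ratio))
    n_neg = max(1, int(len(negatives) * ratio))
    return rng.sample(positives, n_pos), rng.sample(negatives, n_neg)

pos = list(range(1000))   # stand-ins for positive instances
neg = list(range(5000))   # stand-ins for negative instances
small_pos, small_neg = build_cheap_task(pos, neg)
# Pairwise evaluation cost drops from 1000*5000 pairs to 100*500 pairs.
print(len(small_pos) * len(small_neg))  # 50000
```

With a 10% sample per class, each fitness evaluation on the cheap task touches roughly 1% of the positive/negative pairs of the original task.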
This paper develops an evolutionary multitasking AUC optimization framework, termed EMTAUC, to address these two issues. Because of the pairwise learning characteristics of AUC optimization, function evaluation takes longer as the number of instances increases. Thus, to solve the first issue, a small-scale dataset is established by sampling from the original dataset, so that evaluating the AUC in the designed cheap task takes much less time. Moreover, because the cheap task converges quickly, the data structure of the inexpensive task is dynamically adjusted to increase the diversity of the knowledge it contains. Compared with the original task, the designed inexpensive task holds only partial knowledge because it contains only part of the dataset. Thus, it is necessary to bridge the gap between the original task and the cheap task by dynamically adjusting the items of the small-scale dataset: instances with low AUC scores are filled into the sampled dataset. The solution to the second
issue is inspired by evolutionary multitasking optimization (EMTO) [11], [25], a new paradigm for solving multiple optimization tasks simultaneously that draws on the parallel processing of the human brain and the population-based search of EAs. Recently, EMTO has been successfully applied to many practical challenges [25]-[45] owing to its powerful search capability and easy scalability. These studies show that transferring valuable knowledge across tasks can improve the convergence of EAs. Unlike existing surrogate-assisted EAs [46], [47], EMTO is adopted here to exploit the knowledge of both the designed cheap task and the original task to improve the AUC accuracy of a single task. To share knowledge between these two tasks, a multitasking AUC optimization environment is first established, in which the models obtained from the cheap task are transferred to improve the performance of the expensive task, and the better models acquired from the expensive task are shared to enhance the convergence of the cheap task.
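The bidirectional transfer described above can be sketched as follows. This is a hypothetical illustration of the general mechanism, not the paper's algorithm: all names, the variation operator, the transfer interval, and the toy fitness are assumptions.

```python
import random

def evolve_multitask(pop_cheap, pop_exp, eval_cheap, eval_exp,
                     step, n_gen=80, transfer_every=5, k=2):
    """Co-evolve a cheap task and an expensive task, periodically
    injecting each task's best models into the other's population."""
    for gen in range(n_gen):
        pop_cheap = step(pop_cheap, eval_cheap)  # one EA generation, cheap task
        pop_exp = step(pop_exp, eval_exp)        # one EA generation, expensive task
        if gen % transfer_every == 0:
            # cheap -> expensive: fast-converging models seed the slow task
            best_cheap = sorted(pop_cheap, key=eval_cheap, reverse=True)[:k]
            # expensive -> cheap: accurate models steer the small task
            best_exp = sorted(pop_exp, key=eval_exp, reverse=True)[:k]
            pop_exp = pop_exp[:-k] + best_cheap
            pop_cheap = pop_cheap[:-k] + best_exp
    return pop_exp

rng = random.Random(0)

def step(pop, fitness):
    """Gaussian mutation plus truncation selection (a minimal EA step)."""
    children = [x + rng.gauss(0, 0.2) for x in pop]
    return sorted(pop + children, key=fitness, reverse=True)[:len(pop)]

f = lambda x: -(x - 3.0) ** 2   # toy stand-in for AUC on either task
final = evolve_multitask([0.0] * 10, [0.0] * 10, f, f, step)
best = max(final, key=f)        # converges toward the optimum at x = 3
```

Here both tasks share one toy fitness; in EMTAUC the two evaluators would differ (small sampled dataset versus full dataset), which is precisely why transferred models carry complementary knowledge.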
To validate the performance of EMTAUC, a series of experiments on real-world datasets is conducted. Moreover, four state-of-the-art EMTO algorithms are embedded in EMTAUC. The experimental results show that, while optimizing AUC on the original task alone is a reasonable goal, jointly evolving the designed cheap task and the original task has a significant effect: the inclusion of an inexpensive task significantly increases AUC accuracy compared with performing the original task in isolation. Moreover, a systematic comparison with existing gradient-based methods shows that EMTAUC matches or exceeds all the other algorithms. The highlights of the proposed EMTAUC are summarized as follows:
1) An evolutionary multitasking framework is first proposed to handle the expensive AUC optimization problem.
