transfer, a mixture model to unify the target and source distributions is defined as:

\[
m(w) = \alpha_{\pi}\,\pi(w) + \alpha_{\varphi}\,\varphi(w), \tag{6}
\]
where $\alpha_{\pi}$ and $\alpha_{\varphi}$ are the mixing coefficients for the target and source distribution components, respectively. This mixture model can be initialized with arbitrary mixing coefficients, subject to the constraint $\alpha_{\pi} + \alpha_{\varphi} = 1$. In each iteration of neuroevolution, a predefined parameter determines whether the knowledge transfer mode is activated. In the transfer mode, $m$ pseudo-offspring are sampled from the mixture model $m(w)$ instead of the search distribution $\pi(w\,|\,\theta)$, in effect inducing pseudo-offspring from the source into the target problem. Otherwise, the algorithm continues to draw pseudo-offspring from the target search distribution.
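To make the transfer-mode sampling concrete, the sketch below draws $m$ pseudo-offspring from the mixture. Purely for illustration, it assumes that both the target distribution $\pi$ and the fixed source distribution $\varphi$ are Gaussians specified by a mean and the square root $A$ of their covariance; the function name and signature are not prescribed by the text.

import numpy as np

def sample_mixture(m, mu_t, A_t, mu_s, A_s, alpha_pi, alpha_phi, rng):
    """Draw m pseudo-offspring from m(w) = alpha_pi*pi(w) + alpha_phi*phi(w).

    Assumes pi (target) and phi (source) are Gaussians given by their mean
    and the square root A of their covariance (Sigma = A A^T).
    """
    dim = mu_t.shape[0]
    offspring = np.empty((m, dim))
    for k in range(m):
        # Pick a component with probability equal to its mixing coefficient,
        # then draw the pseudo-offspring from that component.
        if rng.random() < alpha_pi:
            offspring[k] = mu_t + A_t @ rng.standard_normal(dim)  # from pi(w)
        else:
            offspring[k] = mu_s + A_s @ rng.standard_normal(dim)  # from phi(w)
    return offspring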
It is important to note that the target search distribution is evolving (see Section IV-C), while the source distribution remains fixed during the optimization process. The key step is how to dynamically update the mixing coefficients $\alpha_{\pi}$ and $\alpha_{\varphi}$ as the search progresses, where $\alpha_{\varphi}$ can be viewed as the degree of transfer from source to target. If the source problem is beneficial, we would want to increase $\alpha_{\varphi}$ so that more high-quality pseudo-offspring can be sampled from the source distribution to influence the target search. However, after a certain point, the pseudo-offspring from the source may no longer remain competitive with those sampled from the evolved target search distribution. At that point, we would want to gradually reduce $\alpha_{\varphi}$, eventually deactivating the source component of the mixture model, i.e., setting $\alpha_{\varphi} = 0$.
Clearly, how beneficial a pseudo-offspring is to the target problem should be reflected by its fitness. To this end, we propose to update the mixing coefficients towards better expected fitness under the mixture model, with the following generalization of (3):
\[
J_m = \int f(w)\,\big[\alpha_{\pi}\,\pi(w) + \alpha_{\varphi}\,\varphi(w)\big]\,dw. \tag{7}
\]

Then, the gradient estimates for $\alpha_{\pi}$ and $\alpha_{\varphi}$ on the expected fitness can be derived, using the log-likelihood trick, as:

\[
\nabla_{\alpha_{\pi}} J_m = \sum_{k=1}^{m} f(w_k)\,\nabla_{\alpha_{\pi}} \log\big(\alpha_{\pi}\,\pi(w_k) + \alpha_{\varphi}\,\varphi(w_k)\big)
= \sum_{k=1}^{m} f(w_k)\,\frac{\pi(w_k)}{\alpha_{\pi}\,\pi(w_k) + \alpha_{\varphi}\,\varphi(w_k)}, \tag{8a}
\]

and,

\[
\nabla_{\alpha_{\varphi}} J_m = \sum_{k=1}^{m} f(w_k)\,\frac{\varphi(w_k)}{\alpha_{\pi}\,\pi(w_k) + \alpha_{\varphi}\,\varphi(w_k)}, \tag{8b}
\]
given $m$ pseudo-offspring $w_1, w_2, \ldots, w_m$ sampled from the mixture model and their fitness values $f(w_1), f(w_2), \ldots, f(w_m)$. With these gradient estimates, the mixing coefficients can be updated as follows:

\[
\alpha_{\pi} \leftarrow \alpha_{\pi} + \eta_{\alpha} \cdot \nabla_{\alpha_{\pi}} J_m, \tag{9a}
\]
\[
\alpha_{\varphi} \leftarrow \alpha_{\varphi} + \eta_{\alpha} \cdot \nabla_{\alpha_{\varphi}} J_m, \tag{9b}
\]
where $\eta_{\alpha}$ is the learning rate. Note that the constraint $\alpha_{\pi} + \alpha_{\varphi} = 1$ can be easily imposed by normalization. The mixture model formulation described above can be seamlessly extended to multiple components for transfer from multiple sources, making the method more powerful since the chance of including useful experience increases with the number of sources.
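As a concrete illustration of (8a)-(9b), the following sketch performs a single update of the mixing coefficients given the pseudo-offspring fitness values and the target and source densities evaluated at those points. The function name, the clipping of negative coefficients, and the small epsilon guard are illustrative assumptions rather than details taken from the text.

import numpy as np

def update_mixing_coefficients(alpha_pi, alpha_phi, fitness, pi_w, phi_w, eta_alpha):
    """One gradient step on the mixing coefficients, following (8a)-(9b).

    fitness : array of f(w_k) for the m pseudo-offspring
    pi_w    : array of target densities pi(w_k)
    phi_w   : array of source densities phi(w_k)
    """
    mix = alpha_pi * pi_w + alpha_phi * phi_w      # mixture density m(w_k)
    grad_pi = np.sum(fitness * pi_w / mix)         # gradient estimate (8a)
    grad_phi = np.sum(fitness * phi_w / mix)       # gradient estimate (8b)

    alpha_pi = alpha_pi + eta_alpha * grad_pi      # update (9a)
    alpha_phi = alpha_phi + eta_alpha * grad_phi   # update (9b)

    # Re-impose alpha_pi + alpha_phi = 1 by normalization; clipping at zero
    # (an added safeguard, not specified in the text) allows a component to
    # be fully deactivated, e.g., alpha_phi = 0.
    alpha_pi, alpha_phi = max(alpha_pi, 0.0), max(alpha_phi, 0.0)
    total = max(alpha_pi + alpha_phi, 1e-12)
    return alpha_pi / total, alpha_phi / total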
B. Influencing Evolution of the Target Search Distribution

After sampling pseudo-offspring from the mixture model and evaluating their fitness, the target search distribution is to be updated. Broadly speaking, we expect the target distribution to gradually evolve under the guidance of pseudo-offspring transferred from the source. In this way, it has a greater chance of exploring diverse and high-quality solutions along the search path. However, excessively large changes to the mean $\mu$ of the target search distribution in a single update step should be avoided. At the same time, it is not advisable to be overly greedy in expanding the covariance matrix $\Sigma$ when good pseudo-offspring (from the source) are very far away from the target distribution. A direct induction of pseudo-offspring from the source may result in precisely such deleterious outcomes, often leading to numerical instability issues.
To understand this issue, let us assume a pseudo-offspring $w_k$ with fitness $f(w_k)$ is induced from the source distribution. Its contributions to the distributional parameter gradient estimates are $f(w_k)\,z_k$ and $f(w_k)\,(z_k z_k^{T} - I)$ [49], [50], where $z_k = A^{-1}(w_k - \mu)$ maps $w_k$ into the target search distribution's natural coordinates; $\mu$ and $A$ are the center and the square root of the covariance matrix of the target search distribution, respectively. If $w_k$ is far from $\mu$ relative to $A$, then the magnitude of $z_k = A^{-1}(w_k - \mu)$ will be extremely large. As a result, $z_k z_k^{T}$ will be further exaggerated. This could cause the gradient estimates to (numerically) explode, causing the distributional parameter updates to fail. Such instability is likely to occur even in moderate dimensions, due to distribution sparsity as a consequence of the curse of dimensionality.
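The short sketch below makes this instability tangible: for an assumed narrow Gaussian target distribution, a source-induced sample lying far from the center produces natural coordinates of very large magnitude, and hence disproportionately large terms $f(w_k)\,z_k$ and $f(w_k)\,(z_k z_k^{T} - I)$ in the gradient estimates. The specific dimensions and numbers are illustrative only.

import numpy as np

# A pseudo-offspring far from the target center (relative to A) yields natural
# coordinates z_k with a very large norm, so its contributions to the gradient
# estimates dominate and can destabilize the distributional parameter updates.
rng = np.random.default_rng(0)
dim = 20
mu = np.zeros(dim)
A = 0.1 * np.eye(dim)                       # narrow target search distribution

w_near = mu + A @ rng.standard_normal(dim)  # typical sample from the target
w_far = mu + 5.0 * np.ones(dim)             # source-induced sample far from mu

for w in (w_near, w_far):
    z = np.linalg.solve(A, w - mu)          # z_k = A^{-1} (w_k - mu)
    outer = np.outer(z, z) - np.eye(dim)    # z_k z_k^T - I
    print(f"|z_k| = {np.linalg.norm(z):9.2f}, "
          f"max |z_k z_k^T - I| = {np.abs(outer).max():10.2f}")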
To overcome the instability issue, we propose to project the source distribution's pseudo-offspring closer to the target search distribution before updating the distributional parameters. While doing so, however, the original fitness values of the projected offspring are retained. A simple strategy to perform such a projection, under the assumption that the target follows a multivariate normal distribution (a common practice in most probabilistic model-based evolution strategies), is outlined below. First, we define a tunable parameter $r$ to represent a threshold Mahalanobis distance from the distribution center. Choosing, say, $r = 3$ implies that only those pseudo-offspring lying more than a Mahalanobis distance of 3 away from the target distribution center are projected. In particular, the direction vector $d_k = w_k - \mu$ is computed, and then the corresponding pseudo-offspring $w_k$ is projected to $\tilde{w}_k$ as shown below (so that it lies within $r$ Mahalanobis distance of the center):
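The projection equation itself continues beyond this excerpt; as a placeholder, the sketch below implements one plausible projection consistent with the description above, namely rescaling $d_k$ so that the projected point sits exactly at Mahalanobis distance $r$ from the center while retaining its original fitness. The function name and the exact rescaling rule are assumptions, not the paper's equation.

import numpy as np

def project_pseudo_offspring(w_k, mu, A, r=3.0):
    """Pull a far-away pseudo-offspring back toward the target distribution.

    Assumes a Gaussian target with center mu and square-root covariance A
    (Sigma = A A^T). If w_k lies more than r Mahalanobis distance from mu,
    it is projected along d_k = w_k - mu to distance r; otherwise it is
    returned unchanged. This is an illustrative guess at the projection,
    not the equation given in the paper.
    """
    d_k = w_k - mu                           # direction vector d_k
    z_k = np.linalg.solve(A, d_k)            # natural coordinates A^{-1} d_k
    dist = np.linalg.norm(z_k)               # Mahalanobis distance of w_k
    if dist <= r:
        return w_k                           # already within the threshold
    return mu + (r / dist) * d_k             # rescale so the distance equals r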
