
distributed algorithms. Another is to design customized private versions of specific deep learning models, such as long short-term memory (LSTM) networks, generative adversarial networks (GANs), and their variants.
1) Centralized Training Algorithms
Deep learning models are often optimized by gradient descent or its variants, which minimize a nonlinear objective function to find good model parameters. In most cases, the training data are held centrally, which we call centralized training. In an early work, stochastic gradient descent (SGD) with differentially private updates was derived for general convex objectives [72]. Inspired by this, an intuitive approach to privacy preservation is to add random noise to the gradients.
NoisySGD, a differentially private version of the SGD algorithm, was proposed in [73] to preserve the privacy of training data through differentially private optimization. Based on the Gaussian mechanism, NoisySGD clips each per-example gradient in l2 norm, groups examples into lots, and adds noise to the sum of gradients in each lot; it applies directly to the training of non-convex deep learning models. Further, it refines the privacy loss analysis with the moments accountant, which tracks the effect of the training data over the course of the SGD computation and yields a privacy-preserving deep learning model. Following the learning architecture of NoisySGD, other differentially private deep learning techniques based on gradient perturbation have been proposed in [74]-[80].
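
To make the mechanism concrete, below is a minimal NumPy sketch of one NoisySGD-style update, assuming a simple logistic-regression loss; the parameter values (clip_norm, sigma, the learning rate) are illustrative choices, not those of [73].

import numpy as np

def noisy_sgd_step(w, X_lot, y_lot, lr=0.1, clip_norm=1.0, sigma=1.0):
    # One gradient-perturbation step in the style of NoisySGD [73]:
    # clip each per-example gradient in l2 norm, sum over the lot,
    # add Gaussian noise calibrated to the clipping bound, then step.
    clipped = []
    for x, y in zip(X_lot, y_lot):
        p = 1.0 / (1.0 + np.exp(-x @ w))   # logistic model (illustrative)
        g = (p - y) * x                    # per-example gradient
        g = g / max(1.0, np.linalg.norm(g) / clip_norm)
        clipped.append(g)
    noisy_sum = np.sum(clipped, axis=0) + np.random.normal(
        0.0, sigma * clip_norm, size=w.shape)
    return w - lr * noisy_sum / len(X_lot)
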
In NoisySGD [73], the injected noise and the consumed privacy budget keep growing with the number of training epochs, which is undesirable since the privacy budget is usually limited. Besides, in existing differentially private deep learning techniques the amount of noise stays the same regardless of the importance of different parameters. The work of Shokri et al. [81] suffers from the same problems. To tackle these drawbacks, Phan et al. [82] proposed a highly effective mechanism for differential privacy preservation in deep learning. Laplace noise is added to the affine transformations of the neurons and to the loss function only once, and the input features are adaptively perturbed according to their contribution to the model output. In addition, like NoisySGD [73], it can be applied to various deep models.
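
As a rough illustration of the adaptive idea, not the exact mechanism of [82], the sketch below splits a total privacy budget across input features in proportion to an assumed relevance score, so that more influential features receive less Laplace noise; the relevance scores and budget split are assumptions for illustration.

import numpy as np

def adaptive_laplace_perturb(x, relevance, epsilon, sensitivity=1.0):
    # Allocate more of the privacy budget (hence less noise) to features
    # with higher relevance; relevance scores are assumed given, e.g.,
    # estimated from each feature's contribution to the model output.
    eps_per_feature = epsilon * relevance / relevance.sum()
    scales = sensitivity / eps_per_feature   # Laplace scale per feature
    return x + np.random.laplace(0.0, scales)

x = np.array([0.8, 0.1, 0.5])
relevance = np.array([3.0, 1.0, 2.0])
print(adaptive_laplace_perturb(x, relevance, epsilon=1.0))
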
2) Distributed Training Algorithms
A prerequisite for centralized training algorithms is that massive data are available for training the deep learning model. Nevertheless, a single institution generally owns only a limited amount of data, which may cause overfitting when training deep learning models. Worse, crowdsourced data collection suffers from obvious privacy issues, because data owners can neither delete their data nor restrict its purpose once it has been collected. Recently, a paradigm of distributed deep learning has emerged in which multiple participants jointly train a deep learning model through a central server, achieving common objectives without sharing their private data.
Shokri et al. [81] presented a pioneering work incorporating differential privacy into distributed deep learning. They carefully designed a practical training framework that enables multiple participants to collaboratively learn a desirable deep learning model without sharing their own training data. Under the assumption that the participants agree on the same objective function in advance, the framework is optimized by the proposed distributed selective SGD protocol, in which each participant independently trains a local model on its own dataset and asynchronously uploads only a fraction of its gradients, truncated and perturbed under a consistent differential privacy mechanism, as sketched below. Each participant can download the latest parameters shared by the other participants to enhance its own local model. This protocol explicitly avoids leaking sensitive information, and the empirical evaluation shows that it achieves accuracy comparable to conventional SGD.
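
A minimal sketch of the selective upload step in the spirit of [81]: pick the largest-magnitude gradient coordinates, truncate them, and perturb them with Laplace noise before sharing. The upload fraction, truncation bound, and budget below are illustrative assumptions, not the paper's values.

import numpy as np

def select_and_perturb(grad, upload_frac=0.1, bound=0.001, eps=1.0):
    # Share only the k largest-magnitude gradient coordinates,
    # truncated to [-bound, bound] and perturbed with Laplace noise.
    flat = grad.ravel()
    k = max(1, int(upload_frac * flat.size))
    idx = np.argsort(np.abs(flat))[-k:]        # selected coordinates
    vals = np.clip(flat[idx], -bound, bound)   # truncate to bound sensitivity
    vals = vals + np.random.laplace(0.0, 2 * bound / eps, size=k)
    return idx, vals  # uploaded to the parameter server
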


Zhang et al. [83] proposed another method for privacy preservation in multiparty deep learning, in a scenario similar to that of [81]. Because each party operates in its local private context, the injected randomization is often overly conservative, resulting in great uncertainty about information disclosure and significant utility loss in the global model. To solve this issue, the method not only enforces differentially private randomization on the local gradients but also considers t-visibility, obtaining a more secure aggregation of the local gradients based on homomorphic encryption and threshold secret sharing. Through the synergy of multiple participants, it provides strong privacy assurance and high effectiveness simultaneously.
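
To illustrate the secret-sharing ingredient only (omitting the thresholding and homomorphic encryption of [83]), the toy sketch below additively splits each party's gradient into random shares, so that the aggregator can reconstruct only the sum of all gradients, never an individual one.

import numpy as np

def make_shares(value, n_parties):
    # Split a gradient vector into n random additive shares that sum to
    # the original value; any n-1 shares alone reveal nothing about it.
    shares = [np.random.uniform(-1.0, 1.0, size=value.shape)
              for _ in range(n_parties - 1)]
    shares.append(value - sum(shares))
    return shares

grads = [np.array([0.2, -0.1]), np.array([0.4, 0.3]), np.array([-0.3, 0.5])]
n = len(grads)
all_shares = [make_shares(g, n) for g in grads]
# Party j sums the j-th share received from every participant ...
partials = [sum(all_shares[i][j] for i in range(n)) for j in range(n)]
# ... and combining the partial sums yields only the aggregate gradient.
print(sum(partials), sum(grads))
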
Following the privacy-preserving training process of deep neural networks in NoisySGD, Chase et al. [74] married differential privacy with secure multiparty computation to avoid privacy leakage in collaborative machine learning. They designed a protocol for training collaborative neural networks in which a private gradient descent method adds random noise, drawn from an appropriate distribution, to the gradients. The collaborative private gradient descent method then ensures that the information compounded per mini-batch is not disclosed excessively as the number of participants increases.
Different from the above differentially private distributed training frameworks [74], [81], [83], which add random noise to gradients, Papernot et al. [84] introduced Private Aggregation of Teacher Ensembles (PATE) for learning generally applicable privacy-preserving models from disjoint private data, agnostic to model details and optimization algorithms. Data owners with the same machine learning task independently train their own teacher models on disjoint sensitive data. The votes of the teacher models are then aggregated, and Laplace noise is added to the aggregation result:
f(x) = \arg\max_j \left\{ n_j(x) + \mathrm{Lap}\left(\tfrac{1}{\gamma}\right) \right\}, \qquad (16)

where n_j(x) denotes the number of teachers that vote for class j on input x, and \gamma is a privacy parameter controlling the scale of the Laplace noise.
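
A small sketch of the noisy-max aggregation in Eq. (16), assuming the teachers' votes for a query have already been collected; the value of gamma is an illustrative choice.

import numpy as np

def pate_aggregate(teacher_votes, gamma=0.05):
    # Noisy-max aggregation as in Eq. (16): add Lap(1/gamma) noise to each
    # class's vote count and return the label with the largest noisy count.
    counts = np.bincount(teacher_votes)  # n_j(x) for each class j
    noisy = counts + np.random.laplace(0.0, 1.0 / gamma, size=counts.shape)
    return int(np.argmax(noisy))

votes = np.array([2, 2, 1, 2, 0, 2, 1])  # labels predicted by 7 teachers
print(pate_aggregate(votes))
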


