
[Figure caption, continued from the previous page: "... network trained with 3 nodes. The contour lines show decision surfaces at different thresholds of the network."]


Artificial neural networks (ANNs) are one type of
machine learning model. Like the human brain, an
ANN contains a network of interconnected nodes,
or neurons. A deep neural network (DNN) is simply
an ANN with multiple hidden layers.
A common obstacle in the application of neural
networks is overfitting. Overfitting occurs when the
network aligns too closely with the training data set.
The network then has high predictive power on the
training data but a much lower success rate on the
test data set or on live data.
Dropout
Dropout is a technique for addressing this problem.
It involves randomly dropping, or eliminating,
neurons from the network during training
(Srivastava 2014; Figure 2). This prevents units from
co-adapting too much.

Figure 2: The application of dropout to a neural network.
(a) A network without dropout. (b) One iteration of
training the network with a dropout rate of 50% on the
hidden layers.
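
To make the technique concrete, here is a minimal sketch of a network with 50% dropout on its hidden layers, as in Figure 2(b). The framework (PyTorch) and the layer sizes are illustrative assumptions; the article does not specify an implementation.

import torch.nn as nn

# Hypothetical layer sizes; only the Dropout placement reflects Figure 2(b).
model = nn.Sequential(
    nn.Linear(10, 32),   # input -> first hidden layer
    nn.ReLU(),
    nn.Dropout(p=0.5),   # each hidden node is dropped with probability 0.5
    nn.Linear(32, 32),   # second hidden layer
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(32, 1),    # output layer
)

model.train()  # dropout masks are resampled on every forward/backward pass
model.eval()   # dropout is disabled for testing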

The contributions of this work are:
* The effect of varying the dropout rate is studied
on three datasets that are non-time-series,
non-spatial, and consist of heterogeneous inputs.
* There is no single optimal dropout rate shared
by datasets with similar properties. The
experiments show that the optimal dropout rate
can fall anywhere in its range from 0 to 1,
depending on both the dataset selected and the
number of training samples drawn from it (see
the sketch after this list).
* While previous publications report that high dropout
rates hurt the performance of models trained on
small datasets (Srivastava 2014), the experiments
presented here show that dropout can improve
performance even with a training set as small as
450 samples.
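
The experiments behind the second bullet amount to a sweep over the dropout rate. The following sketch shows the shape of such a sweep on synthetic data; the data, model sizes, and training schedule are stand-in assumptions, not the authors' setup.

import torch
import torch.nn as nn

def build_model(q):
    # q is the dropout rate: the fraction of hidden nodes dropped per pass
    return nn.Sequential(
        nn.Linear(20, 64), nn.ReLU(), nn.Dropout(q),
        nn.Linear(64, 64), nn.ReLU(), nn.Dropout(q),
        nn.Linear(64, 1),
    )

# Synthetic stand-in for one small, heterogeneous-input dataset
# (450 samples, matching the smallest training set mentioned above).
torch.manual_seed(0)
X, y = torch.randn(450, 20), torch.randn(450, 1)

for q in [0.0, 0.2, 0.4, 0.6, 0.8]:
    model = build_model(q)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    model.train()
    for _ in range(200):  # fixed, illustrative training budget
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        optimizer.step()
    # In the real experiments, each q would be scored on held-out data;
    # the best q varies with the dataset and the training-set size.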

BACKGROUND

One technique to reduce overfitting is dropout. The
dropout technique avoids overfitting by dropping out
different, randomly chosen nodes during training
(Figure 2). For each backpropagation pass, a new set
of nodes is dropped out; at testing time, no dropout
is applied. Because each backpropagation pass drops
out so many nodes during training, the expressiveness
of the model must be sustained by increasing the
number of nodes, layers, epochs, and so on.
Following Baldi et al. (2013), this paper denotes
the fraction of nodes dropped out during a
backpropagation pass as q (the dropout rate)
and the fraction of nodes retained as p (the
retention rate), so that p = 1 - q. This terminology
helps avoid confusion between the dropout rate and
the parameter p, both of which appear widely in the
literature but mean different things.
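
Using this notation, here is a framework-free NumPy sketch of a single pass (an illustration, not the authors' code). Srivastava (2014) rescales weights by p at test time; the equivalent-in-expectation "inverted" scaling shown here moves that correction into training, so that, as described above, nothing special happens at testing time.

import numpy as np

rng = np.random.default_rng(0)
q = 0.5                       # dropout rate: fraction of nodes dropped
p = 1.0 - q                   # retention rate
h = rng.standard_normal(8)    # activations of one hidden layer

# Training: a fresh random mask is drawn for every backpropagation pass.
mask = rng.random(h.shape) < p   # True = node is retained
h_train = (h * mask) / p         # rescale so the expected activation matches h

# Testing: no dropout; the full layer is used as-is.
h_test = h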
While dropout is most often applied to the hidden
layers of a neural network, it can also be applied to
the model's input nodes. This can reduce overfitting
because input features are often redundant: with
dropout, the model learns to draw on several
redundant input nodes instead of relying on a
single one.
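
A sketch of dropout applied to the input nodes as well as a hidden layer. The layer sizes and rates are illustrative assumptions; the lower rate on the inputs is a common convention, not a prescription from the article.

import torch.nn as nn

model = nn.Sequential(
    nn.Dropout(p=0.2),   # drop 20% of the input nodes on each pass
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # standard hidden-layer dropout
    nn.Linear(64, 1),
)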
Dropout also increases the number of iterations
