ways to incorporate domain expertise within the neural network framework. We describe how, with carefully designed choices, we can guide our neural networks to learn operations similar to those implemented by analytical models, which may be seen as a way of incorporating domain expertise within the DNN framework. As with the residual learning approach described in the section "Facilitating the Learning Process: Gradual Refinement with Deep Residual CNNs," guiding the neural network to perform known analytical steps may facilitate training and hence result in better solutions.
Nonblind deconvolution, for example, is an inverse problem for which we are provided with domain knowledge (specifically, knowledge of the degradation system operating on the original image) that should not be ignored when designing the inverse system. Xu et al. [45] propose to inject their knowledge of the blur kernel directly into the weights of their neural network architecture. More specifically, they use the singular value decomposition of the pseudo-inverse of the blur kernel to initialize the weights of their large 1-D convolution kernels. This provides their model with a good starting point for learning an operation similar to nonblind inverse filtering. Wang et al. [46] design a multilayer neural network that mimics the operations implemented by the unfolded LISTA for SR. The authors use their knowledge of each step of the LISTA algorithm [47] to set their initial weights to precomputed values. Both Xu et al. [45] and Wang et al. [46] show that explicitly initializing the weights using knowledge of another algorithm improves the final performance of their model compared with random weight initialization.
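As a rough illustration of the initialization strategy of Xu et al. [45], the following sketch builds a regularized (pseudo-)inverse of a known blur kernel and decomposes it into separable 1-D filters via the SVD; the function name, the regularization constant, and the number of retained singular pairs are illustrative assumptions rather than the exact procedure of [45]:

import numpy as np

def separable_init_from_kernel(blur_kernel, size=25, eps=1e-2, rank=4):
    # Regularized inverse filter in the Fourier domain (Wiener-like);
    # eps avoids division by near-zero frequency responses.
    K = np.fft.fft2(blur_kernel, s=(size, size))
    inv = np.real(np.fft.ifft2(np.conj(K) / (np.abs(K) ** 2 + eps)))
    inv = np.fft.fftshift(inv)  # center the inverse kernel spatially

    # The SVD splits the 2-D inverse kernel into separable 1-D filter
    # pairs; the leading pairs capture most of its energy.
    U, s, Vt = np.linalg.svd(inv)
    vertical = U[:, :rank] * np.sqrt(s[:rank])                # (size, rank)
    horizontal = np.sqrt(s[:rank])[:, None] * Vt[:rank, :]    # (rank, size)
    return vertical, horizontal

Each retained (vertical, horizontal) pair would then initialize one pair of large 1-D convolution kernels, giving the network a starting point close to inverse filtering.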
LISTA [47] represents a simple case of an unfolding algorithm [48] that aims to combine the advantages of both analytical approaches and neural networks. The basic idea is to start
with an analytical approach and an associated inference algorithm, and unfold the inference iterations as layers in a deep
network. After the size of the network is fixed, it is trained
to perform accurate inference. In [48], the unfolding framework is shown both to interpret conventional networks as mean-field inference in Markov random fields and to yield new architectures by instead using belief propagation as the inference algorithm.
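A minimal sketch of this unfolding idea, in the spirit of LISTA [47], is given below; the layer count, the threshold initialization, and the class and variable names are assumptions made for illustration:

import torch
import torch.nn as nn

def soft_threshold(x, theta):
    # Proximal operator of the l1 norm (shrinkage).
    return torch.sign(x) * torch.clamp(x.abs() - theta, min=0.0)

class UnfoldedLISTA(nn.Module):
    # A few ISTA iterations become network layers whose matrices are
    # initialized from the dictionary D and then trained end to end.
    def __init__(self, D, n_layers=3):
        super().__init__()
        m, n = D.shape                                # observation y ~ D z
        L = torch.linalg.matrix_norm(D, ord=2) ** 2   # Lipschitz constant of D^T D
        self.We = nn.Parameter(D.t() / L)                       # n x m
        self.S = nn.Parameter(torch.eye(n) - (D.t() @ D) / L)   # n x n
        self.theta = nn.Parameter(0.1 * torch.ones(n_layers))   # per-layer thresholds
        self.n_layers = n_layers

    def forward(self, y):                  # y: batch x m
        b = y @ self.We.t()
        z = soft_threshold(b, self.theta[0])
        for k in range(1, self.n_layers):  # one unfolded iteration per layer
            z = soft_threshold(b + z @ self.S.t(), self.theta[k])
        return z                           # estimated sparse code

Training the unfolded network on input-output pairs then refines We, S, and the thresholds beyond their ISTA-derived initial values.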
A similar approach is taken by Yang et al. [49], who design a DNN for reconstructing magnetic resonance images from CS measurements. Each layer in their deep network graph is explicitly implemented to mimic a step of the ADMM [50] optimization procedure. The neural network parameters to be learned include a nonlinear transformation of the CS measurements, the shrinkage function, and the regularization function, in addition to the various hyperparameters of the ADMM algorithm. All of these unknowns are optimized jointly during training of the neural network. Yang et al. [49] show that designing a neural network in this way can achieve higher reconstruction performance than state-of-the-art methods.
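The sketch below shows one simplified unrolled ADMM iteration with a learnable transform, shrinkage, and hyperparameters; it conveys the flavor of this unrolling but is not the exact layer design of [49], whose reconstruction step is solved in closed form in the Fourier domain for CS-MRI. The fixed gradient step size and the names used here are assumptions:

import torch
import torch.nn as nn

class UnrolledADMMBlock(nn.Module):
    # One ADMM iteration as a network block; x, z, beta are length-n vectors,
    # y is a length-m vector, and A is an m x n measurement matrix.
    def __init__(self, n):
        super().__init__()
        self.D = nn.Parameter(0.01 * torch.randn(n, n))  # learnable transform
        self.rho = nn.Parameter(torch.tensor(1.0))       # penalty parameter
        self.lam = nn.Parameter(torch.tensor(0.1))       # regularization weight
        self.eta = nn.Parameter(torch.tensor(1.0))       # multiplier step size

    def forward(self, x, z, beta, A, y):
        # x-update: gradient step on the data term plus the quadratic penalty.
        grad = A.t() @ (A @ x - y) + self.rho * (self.D.t() @ (self.D @ x - z + beta))
        x = x - 0.1 * grad
        # z-update: shrinkage (proximal step) on the transform coefficients.
        u = self.D @ x + beta
        z = torch.sign(u) * torch.clamp(u.abs() - self.lam / self.rho, min=0.0)
        # Multiplier update.
        beta = beta + self.eta * (self.D @ x - z)
        return x, z, beta

Stacking several such blocks and training them end to end on measurement-image pairs mirrors the strategy of [49], where the shrinkage is additionally replaced by a learnable piecewise-linear function.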
Some authors have also enforced a fixed operation with nontrainable weights within the network to influence the model to operate in a particular way. For example, the encoder-decoder
CNN designed by Fischer et al. [32] for optical flow estimation uses a correlation layer at the end of the encoder part of
their network architecture. After computing a separate set of feature maps for each of the two input video frames, the correlation layer explicitly computes the correlations between
the two sets. The result of this correlation is then used by the
decoder part of their network. Fischer et al. [32] hypothesize
that explicitly incorporating this correlation layer, instead of
having the network learn the operation, facilitates the learning process. Similarly, in their work on blind deconvolution,
Schuler et al. [51] perform end-to-end training of a deep layered architecture in which the first module, corresponding to the feature-extraction step, is a trainable CNN, while the other two modules are fixed and implement operations from traditional image deconvolution. Their deep network learns to deblur the input
image by iteratively alternating between the three modules.
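A simplified sketch of such a fixed correlation layer (single-pixel comparisons rather than the patch comparisons used in [32]) could look as follows; the function name and the default displacement range are illustrative:

import torch
import torch.nn.functional as F

def correlation_layer(f1, f2, max_disp=4):
    # f1, f2: (B, C, H, W) feature maps from the two input frames.
    # For every displacement within +/- max_disp, compute the per-location
    # dot product between f1 and the shifted f2 (no trainable weights).
    B, C, H, W = f1.shape
    f2p = F.pad(f2, [max_disp] * 4)  # pad width and height by max_disp
    out = []
    for dy in range(2 * max_disp + 1):
        for dx in range(2 * max_disp + 1):
            shifted = f2p[:, :, dy:dy + H, dx:dx + W]
            out.append((f1 * shifted).sum(dim=1, keepdim=True) / C)
    return torch.cat(out, dim=1)     # (B, (2 * max_disp + 1) ** 2, H, W)

The resulting volume of displacement-wise similarities is then passed to the decoder, which learns to turn it into a flow estimate.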

Using neural networks as denoisers in variable
splitting-based optimization methods
Recently, a new approach has been proposed that directly combines analytical optimization methods with DNNs. Using variable splitting techniques, such as the ADMM and HQS methods [50], the inverse problem is split into two subproblems: a fidelity-term subproblem and a regularization subproblem, which are then solved by alternating optimization. Recent research has proposed the use of DNNs to
tackle the regularization subproblem. More specifically, in the
context of the variable splitting methods, the regularization
step can be interpreted as a denoising procedure, in which the
restored image at a particular step of the algorithm is mapped
to a more plausible image with the guidance of the prior term.
Instead of hand-engineering the prior term, DNNs have recently been proposed to perform the regularization step. Zhang et al. [52], for example, show that their set of learned denoisers can be incorporated into their optimization framework to solve problems other than image denoising, such as image deblurring and image SR. The denoiser CNNs essentially act as prior terms that regularize the optimization-based restoration procedure for the inverse problem at hand. Similarly,
because Chang et al.'s [53] encoder-decoder CNN is trained
in an adversarial learning context (discussed in the section
"Using Generative Adversarial Networks to Learn Posteriors
for the Inverse Problem"), it acquires a prior knowledge that is
directly extracted from the statistics of the images seen in the
training data set, and not dependent on the type of the inverse
problem we are trying to solve. This allows the authors to apply
their trained model to other inverse problem tasks, such as CS,
image inpainting, and image SR.
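A compact sketch of this plug-and-play idea with HQS, in the spirit of [52], is shown below; the fixed penalty weight, the initialization, and the denoiser argument (standing in for a trained CNN denoiser) are illustrative assumptions:

import torch

def hqs_plug_and_play(y, A, denoiser, n_iters=30, mu=0.5):
    # Solve min_x 0.5 * ||A x - y||^2 + prior(x) by half-quadratic splitting,
    # with the prior handled implicitly by a learned denoiser.
    n = A.shape[1]
    AtA, Aty = A.t() @ A, A.t() @ y
    muI = mu * torch.eye(n)
    z = A.t() @ y                 # crude initialization of the estimate
    for _ in range(n_iters):
        # Fidelity-term subproblem: closed-form solution of a quadratic problem.
        x = torch.linalg.solve(AtA + muI, Aty + mu * z)
        # Regularization subproblem: one pass through the denoiser network.
        z = denoiser(x)
    return z

In practice the penalty weight is usually increased over the iterations, and the denoiser is chosen (or conditioned) according to the corresponding noise level.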

Making careful design choices for solving inverse problems
Special caution must be taken when using CNNs for regression tasks. Architectural choices that may work for a classifier CNN may harm the learning process of a CNN trained to solve an inverse problem via regression. For example,
