The Bayesian framework is designed to allow prior knowledge to influence the estimation process in an optimal fashion. Specifically, within a Bayesian framework, estimation of the parameter vector $\boldsymbol{\theta}$ is derived from the joint pdf $f_{X,\theta}(\mathbf{x},\boldsymbol{\theta})$ instead of solely the conditional (non-Bayesian) pdf $f_{X|\theta}(\mathbf{x}|\boldsymbol{\theta})$. From basic probability theory, the joint density can be expressed as $f_{X,\theta}(\mathbf{x},\boldsymbol{\theta}) = f_{\theta|X}(\boldsymbol{\theta}|\mathbf{x})\, f_X(\mathbf{x})$, where the posterior density $f_{\theta|X}(\boldsymbol{\theta}|\mathbf{x})$ clearly summarizes all the information needed to make any inference on $\boldsymbol{\theta}$ based on the data $\mathbf{x} = \{x_m\}_{m=1}^{M}$. The joint density can likewise be related to the conditional density that models the parameter's influence on the data measurements, i.e., $f_{X,\theta}(\mathbf{x},\boldsymbol{\theta}) = f_{X|\theta}(\mathbf{x}|\boldsymbol{\theta})\, f_{\theta}(\boldsymbol{\theta})$. Prior knowledge about the parameter vector $\boldsymbol{\theta}$ is reflected in the prior pdf $f_{\theta}(\boldsymbol{\theta})$. When there is no prior knowledge, all outcomes for the parameter vector can be assumed to be equally likely. Such a noninformative prior pdf often leads to results consistent with standard non-Bayesian approaches, i.e., it yields algorithms and bounds that rely primarily on $f_{X|\theta}(\mathbf{x}|\boldsymbol{\theta})$. Thus, the Bayesian framework can, in a sense, be considered a generalization of the non-Bayesian framework [27], [43], [44].
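As a concrete illustration of this factorization, the short Python sketch below evaluates the posterior numerically on a grid by combining an assumed likelihood and prior through Bayes' rule. The Gaussian likelihood, Gaussian prior, and all numerical values are illustrative assumptions chosen here; they are not quantities taken from the article.

import numpy as np
from scipy.stats import norm

# Illustrative sketch (assumed setup, not from the article): a scalar theta is
# the unknown mean of i.i.d. Gaussian data with known standard deviation sigma.
rng = np.random.default_rng(0)
sigma, theta_true, M = 1.0, 0.7, 25
x = rng.normal(theta_true, sigma, size=M)        # data x = {x_m}, m = 1, ..., M

# Bayes' rule on a grid: f_{theta|X}(theta|x) is proportional to
# f_{X|theta}(x|theta) * f_theta(theta).
theta_grid = np.linspace(-3.0, 3.0, 2001)
d_theta = theta_grid[1] - theta_grid[0]
log_lik = norm.logpdf(x[:, None], loc=theta_grid, scale=sigma).sum(axis=0)
log_prior = norm.logpdf(theta_grid, loc=0.0, scale=2.0)   # assumed prior f_theta
log_post = log_lik + log_prior
post = np.exp(log_post - log_post.max())                  # avoid underflow
post /= post.sum() * d_theta                              # normalize on the grid

The resulting vector post approximates $f_{\theta|X}(\theta|\mathbf{x})$ up to the grid discretization.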
When the model is perfectly specified, the optimal Bayesian estimator under cost metrics such as the squared error and the uniform cost depends primarily on the posterior distribution $f_{\theta|X}(\boldsymbol{\theta}|\mathbf{x})$. Indeed, the squared error cost is minimized by the conditional mean estimator $\hat{\boldsymbol{\theta}}_{\mathrm{MSE}}(\mathbf{x}) = E_{f_{\theta|X}}\{\boldsymbol{\theta}|\mathbf{x}\}$, and the uniform cost is minimized by the maximum a posteriori (MAP) estimator $\hat{\boldsymbol{\theta}}_{\mathrm{MAP}}(\mathbf{x}) = \arg\max_{\boldsymbol{\theta}} f_{\theta|X}(\boldsymbol{\theta}|\mathbf{x})$ [27], [44]. Under a perfect model specification, the asymptotic properties of Bayes estimators and of the posterior distribution have been investigated extensively. Under suitable conditions, as the number of data samples increases, the Bayes estimator tends to become independent of the prior distribution [27, Ch. 4]. Thus, the influence of the prior distribution on a posteriori inferences decreases, and asymptotic behavior similar to that of the non-Bayesian ML estimator emerges. Indeed, strong consistency, efficiency, and normality properties of Bayes estimators have been established for a large class of prior distributions [41]. This asymptotic behavior has some intuitive appeal, since the prior pdf represents a statistical summary of one's best guess (prior to an actual experiment) of the likelihood that the desired parameter will assume any particular value. As actual data measurements become available, however, it makes sense that one will eventually abandon the guidance provided by the prior pdf in light of the valuable information carried by the data measurements obtained from the actual experiment. This phenomenon is well established and has been observed in SP applications. When the prior $f_{\theta}(\boldsymbol{\theta})$ is incorrect but the model $f_{X|\theta}(\mathbf{x}|\boldsymbol{\theta})$ is correct, a significantly larger number of data observations (or a higher SNR) may be required before the Bayes estimator becomes independent of the influence of the incorrect prior pdf [21, p. 4737].
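To make the two estimators and the fading influence of the prior concrete, the following sketch assumes, purely for illustration, a Gaussian likelihood with known variance and a deliberately mismatched Gaussian prior centered at $-2$, and computes grid-based MMSE and MAP estimates for increasing sample sizes; none of these choices come from the article.

import numpy as np
from scipy.stats import norm

# Assumed illustrative setup (not from the article): Gaussian data with known
# sigma and a deliberately mismatched, informative Gaussian prior on the mean.
rng = np.random.default_rng(1)
sigma, theta_true = 1.0, 0.7
grid = np.linspace(-4.0, 4.0, 4001)
d_th = grid[1] - grid[0]

def grid_posterior(x, prior_mean=-2.0, prior_std=0.5):
    """Posterior of theta on a grid under an informative (here incorrect) prior."""
    log_post = (norm.logpdf(x[:, None], loc=grid, scale=sigma).sum(axis=0)
                + norm.logpdf(grid, loc=prior_mean, scale=prior_std))
    post = np.exp(log_post - log_post.max())
    return post / (post.sum() * d_th)

for M in (5, 50, 500):
    x = rng.normal(theta_true, sigma, size=M)
    post = grid_posterior(x)
    theta_mse = np.sum(grid * post) * d_th      # conditional-mean (MMSE) estimate
    theta_map = grid[np.argmax(post)]           # MAP estimate
    print(f"M={M:4d}  MMSE={theta_mse:+.3f}  MAP={theta_map:+.3f}")

# For small M both estimates are pulled toward the prior mean (-2); as M grows
# they approach theta_true, illustrating the diminishing influence of the prior.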
Misspecification within a Bayesian framework refers to the possibility that the assumed joint pdf $f_{X,\theta}(\mathbf{x},\boldsymbol{\theta})$ may be incorrect. This, of course, includes the prior pdf $f_{\theta}(\boldsymbol{\theta})$ as well as the model $f_{X|\theta}(\mathbf{x}|\boldsymbol{\theta})$. Under model misspecification, the asymptotic properties of the posterior distribution have also been investigated extensively. The following discussion attempts to summarize some key results on this topic, although no claims are made here that the summary is complete or exhaustive. The goal here is to identify results that, in the authors' view, are of potential interest to the SP community. The first discussion to follow will focus on published results that detail the asymptotic behavior and properties of the Bayesian posterior distribution under model misspecification, i.e., the asymptotic behavior of $f_{\theta|X}(\boldsymbol{\theta}|\mathbf{x})$ as the amount of data increases. These results can be considered the Bayesian counterparts of the contributions of Huber [20] and White [46] that detail ML estimator performance under misspecification, as discussed earlier. Second, a discussion of results on misspecified Bayesian bounds is given. As this remains a relatively new area of research, there appear to be very few published results on this topic; hence, a brief discussion of some of the topic's inherent issues is also provided.
Bayesian estimation under misspecified models
Since Bayes estimators are derived from the posterior density $f_{\theta|X}(\boldsymbol{\theta}|\mathbf{x})$, considering its asymptotic behavior yields insights into the convergence properties of the associated estimators. Berk [4] was the first to investigate the asymptotic behavior of the posterior distribution under misspecification as the number of data observations becomes arbitrarily large. Specifically, consider a set of i.i.d. data measurements $\mathbf{x} = \{x_m\}_{m=1}^{M}$ distributed according to the joint pdf $p_X(\mathbf{x}) = \prod_{m=1}^{M} p_X(x_m)$. Let the assumed pdf of $\mathbf{x}$ be $f_X(\mathbf{x}|\boldsymbol{\theta}) = \prod_{m=1}^{M} f_X(x_m|\boldsymbol{\theta})$ and the assumed prior pdf be $f_{\theta}(\boldsymbol{\theta})$. Define the set $\Theta_A$ such that

$$\Theta_A \triangleq \Big\{ \boldsymbol{\theta} \in \Theta : \arg\min_{\boldsymbol{\theta} \in \Theta} \big\{ -E_p\{\ln f_X(\mathbf{x}|\boldsymbol{\theta})\} \big\} \Big\}. \qquad (17)$$
For a large class of unimodal and well-behaved distributions, the set $\Theta_A$ consists of a single point, i.e., $\Theta_A = \{\boldsymbol{\theta}_0\}$, but the definition clearly allows for the possibility that this set contains more than one point. It is also noteworthy [see also (1)] that the set $\Theta_A$ is simply the set of all points/vectors $\boldsymbol{\theta} \in \Theta$ that minimize the KLD $D(p_X \,\|\, f_{X|\theta})$ between the true and assumed distributions. Berk noted this relation to the KLD in [4], i.e., prior to the Akaike [1] reference to Huber's work [20]. Berk proved that, if $\Theta_A = \{\boldsymbol{\theta}_0\}$, i.e., it consists of a single point $\boldsymbol{\theta}_0$, then the following convergence in distribution holds:
$$f_{\theta|X}(\boldsymbol{\theta}|\mathbf{x}) \triangleq f_{\theta|X}(\boldsymbol{\theta}|x_1, x_2, \ldots, x_M) \xrightarrow[M \to \infty]{d} \delta(\boldsymbol{\theta} - \boldsymbol{\theta}_0), \qquad (18)$$

where $\delta(\mathbf{a}) = \delta(a_1)\,\delta(a_2) \cdots \delta(a_d)$ and $\delta(a)$ is a Dirac delta function.
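The concentration in (18) is easy to observe numerically. In the sketch below (an assumed example, not one taken from the references), the true data are lognormal while the assumed model is exponential, $f_X(x|\theta) = \theta e^{-\theta x}$, with a flat prior on a grid; for this pair the minimizer in (17) has the closed form $\theta_0 = 1/E_p\{x\} = e^{-\mu - s^2/2}$, and the posterior mass piles up around $\theta_0$ as $M$ grows.

import numpy as np

# Assumed illustration of (17)-(18): true data are lognormal(mu, s), while the
# assumed model is exponential with rate theta and a flat prior on a grid.
rng = np.random.default_rng(2)
mu, s = 0.5, 0.8                       # true (lognormal) parameters, assumed here
theta_0 = np.exp(-mu - s**2 / 2)       # KLD minimizer: 1 / E_p{x}

grid = np.linspace(0.01, 2.0, 2000)    # parameter grid for theta > 0
d_th = grid[1] - grid[0]
for M in (100, 1000, 10000):
    x = rng.lognormal(mean=mu, sigma=s, size=M)
    # Misspecified log-likelihood of the exponential model, evaluated on the grid.
    log_post = M * np.log(grid) - grid * x.sum()
    post = np.exp(log_post - log_post.max())
    post /= post.sum() * d_th
    post_mean = np.sum(grid * post) * d_th
    post_std = np.sqrt(np.sum((grid - post_mean) ** 2 * post) * d_th)
    print(f"M={M:6d}  posterior mean={post_mean:.4f}  std={post_std:.4f}  "
          f"theta_0={theta_0:.4f}")

# The posterior mean converges to theta_0 and the spread shrinks toward zero,
# consistent with the point-mass limit delta(theta - theta_0) in (18).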
From (18), one can presume that $\boldsymbol{\theta}_0$ is the counterpart, for the misspecified Bayesian estimation framework, of the pseudotrue parameter vector introduced in (1). This conjecture is validated by the fundamental results of Bunke and Milhaud [6], which provide strong consistency arguments for a class of mismatched (or pseudo) Bayesian (MB) estimators. Specifically, let $L(\cdot,\cdot)$ be a nonnegative, real-valued loss function such that $L(\boldsymbol{\theta},\boldsymbol{\theta}) = 0$. A familiar example of this type of function is the one leading to the MSE between a given estimate $\hat{\boldsymbol{\theta}}$ and the parameter vector $\boldsymbol{\theta}$.