features that light field cameras offer: post-capture refocus,
change of viewpoint, three-dimensional (3-D) data extraction,
change of focal length, focusing through occluders, increasing
visibility in bad weather conditions, and improving the robustness of robot navigation, to name just a few.
In optical design terms, light field imaging presents an (as
yet unfinished) revolution. Since Gauss's day, optical designers
have been thinking in terms of two conjugate planes, the task of
the designer being to optimize a lens system to gather the light
originating at a point on the object plane and converge it as well
as possible to a point on the image plane. The larger the bundle of
rays that can be converged accurately, the more light-efficient the
capture process becomes and the higher the achievable optical
resolution. The requirement of light-efficient capture introduces
focus into the captured images, i.e., only objects within the focal
plane appear sharp. Light field imaging does away with most of
these concepts, purposefully imaging out-of-focus regions and
inherently aiming at capturing the full 3-D content of a scene.
In terms of signal processing, we encounter a high-dimensional sampling problem with nonuniform and nonlinear
sample spacing and high-dimensional spatio-directionally
varying observation/sampling kernels. The light field data,
however, have particular structures that can be exploited for
analysis and reconstruction. This results from the fact that
scene geometry and reflectance link the information contained in different samples. It also distinguishes the reconstruction problem from a classical signal processing task.
On the software side, we witness the convergence of ideas
from image processing, computer vision, and computer graphics.
In particular, the classical preprocessing tasks of demosaicking,
vignetting compensation, undistortion, and color enhancement
are all affected by sampling in four dimensions rather than in two.
Additionally, image analysis by means of computer vision techniques becomes an integral part of the imaging process. Depth extraction and superresolution techniques enhance the data and
mitigate the inherent resolution tradeoff introduced by sampling
two additional dimensions. A careful system calibration is necessary for good performance. Computer graphics ideas, finally, are
needed to synthesize the images ultimately presented to the user.
This article aims to review the principles of light field
imaging and associated processing concepts, while illuminating the remaining challenges. The presentation
roughly follows the acquisition and processing chain from optical acquisition principles to the final rendered output image.
The focus is on single-camera snapshot technologies that are
currently attracting significant commercial interest.
Background
This section, which provides background for the rest of the article, closely follows the development in [2]. An extended discussion at an introductory level can be found, e.g., in [4]. A wider
perspective on computational cameras is given in [5] and [6].
Plenoptic function
The theoretical background for light field imaging is the plenoptic function [7], which is a ray-optical concept that assigns
a radiance value to rays propagating within a physical space.
It considers the usual 3-D space to be penetrated by light
that propagates in all directions. In doing so, the light can be
blocked, attenuated, or scattered.
However, instead of modeling this complexity as, e.g., computer
graphics does, the plenoptic function is an unphysical, modelless,
purely phenomenological description of the light distribution in
space. To accommodate all the possible
variations of light without referring to an underlying model, it
adopts a high-dimensional description: arbitrary radiance values can be assigned at every position of space, for every possible propagation direction, for every wavelength, and for every
point in time. This is usually denoted as $l_\lambda(x, y, z, \theta, \phi, \lambda, t)$,
where $l_\lambda$ [W/m$^2$/sr/nm/s] describes spectral radiance per unit
time, $(x, y, z)$ is a spatial position, $(\theta, \phi)$ is an incident direction,
$\lambda$ is the wavelength of light, and $t$ is a temporal instance.
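To make the parameterization concrete, the following minimal Python sketch treats the plenoptic function as a black-box callable with exactly the seven arguments named above; the constant return value is a placeholder, not a physical model.

# Illustrative sketch only: the plenoptic function as a seven-argument
# callable. The return value stands for spectral radiance per unit time
# [W/m^2/sr/nm/s]; the constant is a placeholder, not a physical model.
def plenoptic(x: float, y: float, z: float,   # spatial position
              theta: float, phi: float,       # propagation direction
              wavelength_nm: float,           # wavelength of light
              t: float) -> float:             # temporal instance
    return 1.0

# Query the radiance carried by a single ray: a point in space, a
# direction, a wavelength of 550 nm, at time t = 0.
sample = plenoptic(0.0, 0.0, 1.0, 0.3, 1.5, 550.0, 0.0)
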
The plenoptic function is mostly of conceptual interest.
From a physical perspective, the function cannot be an arbitrary seven-dimensional function because, e.g., radiant flux is
delivered in quantized units, i.e., photons. Therefore, a time average must be assumed. Similarly, it is not possible to measure infinitely thin pencils of rays (i.e., perfect directions) or
even very detailed spatial light distributions without encountering wave effects. We may, therefore, assume that the measurable function is band-limited and that we are restricted to
macroscopic settings where the structures of interest are significantly larger than the wavelength of light.
Light fields
Light fields derive from the plenoptic function by introducing
additional constraints:
■ They are considered to be static even though video light
fields have been explored [8] and are becoming increasingly feasible. An integration over the exposure period
removes the temporal dimension of the plenoptic function.
■ They are typically considered to be monochromatic, even
though in practice the same reasoning is applied to each color
channel independently. An integration over the spectral
sensitivity of the camera pixels removes the spectral
dimension of the plenoptic function.
■ Most importantly, the so-called "free-space" assumption
introduces a correlation between spatial positions. Rays are
assumed to propagate through a vacuum without objects,
except for those contained in an "inside" region of the
space, often called a scene. Without a medium and without
occluding objects, the radiance is constant along the rays in
the "outside" region. This removes one additional dimension from the plenoptic function [2].
A light field is, therefore, a four-dimensional (4-D) function.
We may assume the presence of a boundary surface S separating the space into the inside part (i.e., the space region containing
the scene of interest) and the outside part, where the acquisition
apparatus is located. The outside is assumed to be empty space.
Then, the light field is a scalar-valued function of $S \times S^2_+$,
where $S^2_+$ is the hemisphere of directions toward the outside.
This definition of a light field is also applied to the term surface
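As a rough numerical illustration of how the constraints above collapse the plenoptic function into a 4-D light field, the Python sketch below starts from hypothetical plenoptic samples already restricted to a parameterized boundary surface S (which is what the free-space assumption permits) and integrates out time and wavelength. All array shapes, the exposure weighting, and the spectral-sensitivity curve are assumptions made purely for the example.

import numpy as np

# Hypothetical plenoptic samples restricted to the boundary surface S,
# parameterized by (u, v); axes: (u, v, theta, phi, wavelength, time).
rng = np.random.default_rng(0)
plenoptic_on_S = rng.random((32, 32, 8, 8, 16, 10))  # toy radiance values

# Assumed exposure weighting over the capture interval (removes t) ...
exposure = np.full(10, 1.0 / 10)
# ... and an assumed spectral sensitivity of the pixels (removes lambda).
sensitivity = np.hanning(16)
sensitivity /= sensitivity.sum()

static = np.tensordot(plenoptic_on_S, exposure, axes=([5], [0]))
light_field = np.tensordot(static, sensitivity, axes=([4], [0]))

print(light_field.shape)  # (32, 32, 8, 8): a 4-D light field L(u, v, theta, phi)

In a real acquisition these integrations are carried out physically by the exposure and the color filters of the sensor; the code merely mimics them numerically to expose the resulting four-dimensional structure.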