Signal Processing - May 2017 - 64

[28] can be inherited. While single-frame depth reconstruction is supported, the phase ambiguity can simultaneously be eliminated through patch-based image matching. These advantages are beyond the capabilities of the original phase shifting using grayscale patterns.

Moreover, the density-modulated binary patterns can be generated using low-cost laser-diffuser emitters, as is done in the Kinect, which gets rid of expensive DLP projectors. This property is extremely important for developing commodity depth cameras. Figure 10(a) shows the coded optical element customized for use with the diffuser. It is fabricated on an opaque base with a 25-mm diameter. The pattern sits in the center of the base, with a 15-mm length in each dimension; the size of a bright dot is 20 nm × 20 nm. Figure 10(b) shows the depth camera prototype in operation. Since the binary pattern is fixed on the diffuser in front of the laser, three lasers need to be aligned with three properly shifted patterns. A timing circuit then switches the three lasers on and off sequentially, together with a synchronized camera. Depth can be reconstructed from either a single frame or three consecutive frames, depending on the motion detected in the scene. Note that the physical size of the laser-diffuser emitter can be made much more compact in actual production. Besides its low cost and compact size, the laser emitter also offers high energy and a large depth of field that far surpass those of DLP projectors.

"A common insight behind most of the techniques discussed previously is to improve the efficiency of space-time multiplexing in terms of the light signal usage."

Sensor fusion

A depth camera is ideally fast, accurate, and robust. Unfortunately, no existing depth camera is perfect on its own. A feasible way to enhance the performance of depth cameras is sensor fusion: different depth sensors have complementary advantages, so a combined device could outperform each single component. Due to its passive nature, stereo is easy to combine with active-illumination sensors. For example, structured-light illumination has been used to improve the accuracy of stereo sensors, since structured light can provide distinguishable features for stereo matching on textureless surfaces [38].

The combination of stereo and structured light has other forms. To achieve high-accuracy, low-delay depth sensing, Weise et al. built a novel depth camera based on three-step phase shifting, where the phase ambiguity is solved by stereo [39]. Their system consists of a DLP projector, two high-speed monochrome cameras, and a color camera. The two monochrome cameras are synchronized and record the three images of phase-shifted fringes. As mentioned previously, the phase ambiguity is an integer disparity within [0, M - 1], where M is the number of periods of the fringe pattern. While stereo matching can be performed between the two monochrome cameras to solve the ambiguity, the number of possible disparities is limited to M; therefore, no dense stereo matching is needed, which allows for a fast implementation. To increase the robustness to noise, specularities, and occlusions, an optimization algorithm using loopy belief propagation followed by a consistency check is proposed to reduce the phase-unwrapping error. Meanwhile, a motion-compensation method is proposed to reduce the motion error. The resulting system gives accurate depth measurements of complex dynamic scenes at 17 fps with the assistance of a GPU. This stereo-assisted phase-unwrapping idea is further extended in [40] by enforcing viewpoint and temporal consistencies.

Stereo has also been used to improve the accuracy of ToF sensors, since stereo can make use of high-resolution commodity cameras, while ToF sensors are limited in resolution. On the other hand, ToF sensors perform better than stereo on textureless surfaces, while stereo is more reliable on surfaces with rich textures, which pose difficulties for ToF sensors due to the large variation of scene albedo. In [41], Zhu et al. describe a multisensor system with two color cameras and a SwissRanger SR3000 ToF sensor. The probability distribution functions of the depth estimates from each sensor modality are fused using a Markov random field model, and belief propagation is then applied to the combined data from ToF and stereo to produce enhanced depth estimates through global regularization. Note that the ToF and stereo sensors need to be calibrated into a common Euclidean coordinate system in advance.

Generally, the combination of passive stereo and active illumination performs better than either alone. But stereo has an inherent limitation: it fails on textureless surfaces, where only active-illumination sensors can work. In [42], Zhang et al. propose a novel fusion framework to combine ToF and phase shifting, the first attempt to combine two active-illumination sensors. The basic idea is to use the coarse, low-resolution depth from ToF to disambiguate the wrapped, high-resolution depth from phase shifting. The proposed system, shown in Figure 11, consists of a second-generation ToF Kinect (the Kinect Gen2), an off-the-shelf DLP projector, and a monochrome camera. Specifically, two key technical issues are addressed in this work. First, both ToF and phase shifting emit light signals, so they will inevitably

FIGURE 10. (a) The coded optical element customized for use with the diffuser. (b) The depth camera prototype using laser-diffuser emitters to generate density-modulated binary patterns.
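The stereo-assisted unwrapping of [39] rests on two pieces: the textbook three-step formula for recovering the wrapped phase from three fringe images, and the observation that only M candidate period indices need to be tested by patch matching. The sketch below is a minimal illustration, not the authors' implementation: the `patch_costs` array stands in for their patch-based stereo matching, and the belief-propagation optimization and consistency check are omitted.

```python
import numpy as np

TWO_PI = 2.0 * np.pi

def wrapped_phase(i1, i2, i3):
    """Textbook three-step phase-shifting formula for fringe images
    captured with phase offsets of -2*pi/3, 0, and +2*pi/3."""
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

def resolve_period(patch_costs):
    """patch_costs: array [..., M] of stereo patch-matching costs, one per
    candidate period index k in [0, M-1]. The search space is only M deep,
    so a brute-force argmin is cheap -- no dense stereo matching needed."""
    return patch_costs.argmin(axis=-1)

def absolute_phase(phi, k):
    """Unwrap: add k full periods to the wrapped phase."""
    return TWO_PI * k + (phi % TWO_PI)
```

Because the cost volume is only M entries deep per pixel, the disambiguation step is what keeps the whole pipeline fast enough for real-time use.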
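The core of the ToF/phase-shifting fusion in [42] — using the coarse ToF depth to pick the period of the wrapped fine depth — reduces per pixel to a rounding step. The sketch below assumes a simplified linear phase-to-depth mapping; `depth_period` is a hypothetical calibration constant introduced for illustration, not a parameter from the paper.

```python
import numpy as np

TWO_PI = 2.0 * np.pi

def unwrap_with_tof(phi_wrapped, depth_tof, depth_period):
    """Pick the period index k so that the unwrapped phase-shifting depth
    lands nearest the coarse ToF depth.
    depth_period: depth spanned by one 2*pi phase wrap (assumed linear)."""
    frac = (phi_wrapped % TWO_PI) / TWO_PI         # fractional position in a period
    k = np.round(depth_tof / depth_period - frac)  # nearest consistent period
    return (k + frac) * depth_period
```

Note that a ToF error of up to nearly half a period is tolerated, since the rounding only has to land in the correct bin; the fine accuracy then comes entirely from the phase measurement.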

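In the ToF-stereo fusion of [41], the data term is the product of the two sensors' per-pixel depth PDFs (a sum in negative-log space), regularized by a smoothness prior and optimized with loopy belief propagation. As an illustrative stand-in, not the authors' implementation, the sketch below fuses two cost tables and runs min-sum belief propagation along a 1-D chain of pixels (where BP is exact); the real system operates on the full 2-D grid with calibrated likelihoods.

```python
import numpy as np

def fuse_and_regularize(cost_tof, cost_stereo, weight=3.0, trunc=2):
    """cost_*: [n_pixels, n_labels] negative log-likelihoods over
    discretized depth labels. Fuse them (product of PDFs = sum of costs)
    and smooth with a truncated-linear pairwise term via exact min-sum
    belief propagation on a 1-D chain of pixels."""
    data = cost_tof + cost_stereo
    n, num_labels = data.shape
    lab = np.arange(num_labels)
    pair = weight * np.minimum(np.abs(lab[:, None] - lab[None, :]), trunc)
    left = np.zeros((n, num_labels))    # messages flowing left -> right
    right = np.zeros((n, num_labels))   # messages flowing right -> left
    for i in range(1, n):
        left[i] = ((data[i - 1] + left[i - 1])[:, None] + pair).min(axis=0)
    for i in range(n - 2, -1, -1):
        right[i] = ((data[i + 1] + right[i + 1])[:, None] + pair).min(axis=0)
    belief = data + left + right        # min-sum belief per pixel
    return belief.argmin(axis=1)        # MAP depth label per pixel
```

The smoothness messages let a confident neighbor overrule a pixel where one modality is unreliable, which is exactly the behavior sought by fusing complementary sensors.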