Signal Processing - July 2016 - 34

representation in red/green/blue (RGB) color images. The RGB information can be efficiently exploited for automatic image segmentation using, for example, K-means clustering family methods.

However, restricting the color information merely to RGB components is a simple abstraction that dismisses the information available within the color object. A color is fully defined by its complete wavelength response, whereas the RGB color space represents only three wavelengths. The perceived color also depends on the illumination condition, viewing angle, and sensor type. As a consequence, efficient color image processing requires an adequate color representation.

Different color spaces can be used to represent various color components, with different degrees of interdependency among them. Of the four classic color spaces, hue/saturation/value (HSV), YCbCr, L*a*b*, and RGB, the RGB color space has the most correlated components, whereas the YCbCr color components are less correlated. This allows uncorrelated components to be extracted and favors the separation of the achromatic and chromatic parts. L*a*b* was originally designed to approximate human vision, with its L component (denoting the luminance) closely matching human perception of lightness. The color-opponent components are represented by a and b. One of the most important attributes of the L*a*b* model is device independence: the colors are defined independently of how they were created or the device on which they are displayed. However, as the L*a*b* color space is much larger than that represented by RGB, it requires more data per pixel to reach the same precision as RGB. The HSV color space sits between RGB and L*a*b* in terms of balancing complexity and perceptual correctness. HSV separates color into three components: two chromatic (hue and saturation) and one achromatic (value).

Based on the targeted app and the effect of color on components such as segmentation and automated decision making, a suitable color space can be selected. However, although most smartphone camera apps store the image in the RGB color space, the forward and backward transforms between RGB, HSV, and L*a*b* are not linear. One solution to the problem of conversion between different color spaces is to use a raw image format, but this solution adds extra overhead in terms of computation and storage.

In the end, similar to feature selection, color space selection is a time-consuming process that should be performed offline, and the decision to choose one space over the others should be based on the importance of color features and their robustness against image acquisition variations. For example, in the case of skin cancer detection, color features may not be as important as morphological features, and therefore, color normalization is not vitally important. In contrast, for an app such as wound assessment, the color feature is one of the most important features for wound grading, so an investment in a more robust color representation may help the app perform better, although more slowly and with a greater energy demand.

3-D computer vision and model
Three-dimensional computer vision uses the portability property of smartphones for better modeling. Because the user can easily move a smartphone, some information about the 3-D structure of the world is captured through the camera (video or image sequence) or other (usually add-on) sensors. In the context of health monitoring, 3-D structure and depth information is valuable in several apps. For example, wound depth is important to assess the condition of a wound, and the quantitative measurement of the elevation of a skin mole is useful for detecting an abnormal mole.

Computer vision systems for 3-D scene reconstruction can be divided into active and passive approaches based on how the range (distance) information is captured. Active approaches, such as structured light, time of flight, and light detection and ranging, use measurements obtained from the projection of a light beam onto a scene. The distance from the source to the scene at each pixel is measured by the travel of the light, producing range (depth) images in which the value of each pixel represents the calibrated distance between the camera and the captured scene. The benefits of active systems are that they are less sensitive to illumination changes and provide range at every pixel location. However, the required equipment is normally available only through add-ons, such as Structure Sensor (www.structure.io), which dramatically increase the cost and energy consumption of the app.

In contrast, passive approaches rely on images acquired by RGB cameras. One of the most prominent is structure from motion (SfM), which uses the texture and color information of these images to find the corresponding 3-D point cloud and reconstruct the original scene. Although these methods do not require additional hardware, their main drawbacks are ambient light sensitivity and computational expense.

Unlike for large-scale structures such as buildings, passive range estimation techniques have not been studied extensively for small objects, including wound beds or skin moles. Therefore, we conducted an experiment to determine the accuracy of some representative passive range techniques for small objects. Figure 2 shows a reconstruction of the 3-D structure of a wound. We used a clay wound model comparable in size to a typical chronic wound (e.g., a foot ulcer) and incremental SfM [31]. Specifically, to obtain the 3-D reconstruction of the wound model from multiple photos captured from different viewpoints, the algorithm iteratively performs several tasks: feature computation, local correspondence matching, and fundamental matrix calculation. A bundle refinement with self-calibration was employed to compute the interior orientation with auxiliary parameters. We used scale-invariant feature transform features, which
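The K-means color segmentation mentioned at the start of this section can be sketched in a few lines of NumPy. This is a minimal illustration, not the method of any particular app; the function name and the deterministic farthest-point initialization are our own choices for the sketch.

```python
import numpy as np

def kmeans_segment(pixels, k, iters=10):
    """Cluster N x 3 RGB pixel vectors into k color classes with plain K-means.

    Uses deterministic farthest-point initialization so results are repeatable.
    """
    pixels = pixels.astype(float)
    # Initialization: start from the first pixel, then repeatedly pick the
    # pixel farthest from all centers chosen so far.
    centers = [pixels[0]]
    for _ in range(1, k):
        dist = np.min([np.linalg.norm(pixels - c, axis=1) for c in centers], axis=0)
        centers.append(pixels[dist.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign every pixel to its nearest center in RGB space ...
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # ... then move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```

For an H x W image, `pixels` would be `img.reshape(-1, 3)`, and `labels.reshape(H, W)` gives the segmentation map, e.g., separating wound tissue from surrounding skin by color.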
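The nonlinearity of the RGB-to-HSV transform noted earlier is easy to see in code. The sketch below uses Python's standard colorsys module purely as a stand-in for whatever conversion routine an app would actually call: averaging two colors in RGB and then converting does not give the average of their HSV coordinates.

```python
import colorsys

# Two fully saturated RGB colors, components in [0, 1].
red = (1.0, 0.0, 0.0)
green = (0.0, 1.0, 0.0)

# Forward transform: RGB -> HSV.
hsv_red = colorsys.rgb_to_hsv(*red)      # (0.0, 1.0, 1.0)
hsv_green = colorsys.rgb_to_hsv(*green)  # (1/3, 1.0, 1.0)

# Nonlinearity check: converting the RGB average is not the same as
# averaging the HSV coordinates (the value components disagree).
rgb_mean = tuple((a + b) / 2 for a, b in zip(red, green))
hsv_of_mean = colorsys.rgb_to_hsv(*rgb_mean)                          # value 0.5
mean_of_hsv = tuple((a + b) / 2 for a, b in zip(hsv_red, hsv_green))  # value 1.0
```

A linear transform would commute with averaging; HSV does not, which is one reason per-pixel conversion of whole images carries a real computational cost on a smartphone.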
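One step of the SfM loop described above, fundamental matrix calculation, can be sketched with the classic normalized eight-point algorithm. This is a textbook estimator given for illustration; the article does not specify which estimator its incremental SfM implementation [31] uses.

```python
import numpy as np

def normalize(pts):
    """Hartley normalization: move the centroid of N x 2 points to the origin
    and scale so the mean distance from it is sqrt(2)."""
    centroid = pts.mean(axis=0)
    s = np.sqrt(2) / np.sqrt(((pts - centroid) ** 2).sum(axis=1)).mean()
    T = np.array([[s, 0.0, -s * centroid[0]],
                  [0.0, s, -s * centroid[1]],
                  [0.0, 0.0, 1.0]])
    homog = np.column_stack([pts, np.ones(len(pts))]) @ T.T
    return homog, T

def fundamental_8pt(x1, x2):
    """Estimate F such that x2_h^T F x1_h ~= 0 for matched points x1, x2 (N x 2)."""
    p1, T1 = normalize(x1)
    p2, T2 = normalize(x2)
    # Each correspondence gives one linear constraint on the 9 entries of F.
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1)),
    ])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2: a valid fundamental matrix is singular.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    F = T2.T @ F @ T1          # undo the normalization
    return F / np.linalg.norm(F)
```

In a full pipeline this estimator would sit inside a RANSAC loop over the local feature matches, with the surviving inliers feeding the bundle refinement mentioned above.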