Signal Processing - March 2017 - 84

system can significantly improve the usability of interactive displays
in vehicles and reduce the effort and attention they require. Assuming that the prediction certainty meets a set criterion, the user need
not touch the display surface to select the intended on-screen item,
allowing midair selection. This solution can therefore also enable
interaction with displays that do not have a physical surface, e.g.,
HUDs and 3-D displays or projections.
This article highlights and gives a unified treatment of the various signal processing (e.g., tracking and filtering, fusion, and prediction)
and human factors (e.g., feedback and prior experience)
challenges posed by the in-vehicle intent-aware display concept,
some of which were individually considered in previous publications (including those for nonautomotive applications), such
as [10]-[19]. In particular, the fundamental problem of intent
inference within a Bayesian framework is addressed here, and
suitable probabilistic prediction models are presented; they lead
to a low-complexity implementation of the inference routine.
Within this formulation, the task of smoothing pointing trajectories perturbed by road and driving conditions via statistical
filtering is discussed. The sensory requirements of the predictive system in the vehicle environment are also briefly outlined.
Data collected in instrumented cars and results from a prototype
predictive touch-screen system are shown to demonstrate the
capabilities of this intelligent HMI solution.

Background

[Figure 2. The angle to the on-display icon and the velocity profile for 30 in-car pointing tasks; the thick red line is the mean. (a) The angle to the intended GUI icon, ∠(Y_k − Y_{k−1}, d_I), plotted in degrees against pointing task time (%). (b) The pointing fingertip velocity ‖Y_k − Y_{k−1}‖₂, plotted in mm/s against pointing task time (%).]

According to the renowned human movement model Fitts'
law [20], the index of difficulty (ID) and total time (T) of

acquiring an interface icon (i.e., pointing and selection) are
given by

ID = log₂(1 + ℓ/W),
T = a + b log₂(1 + ℓ/W),    (1)

where W and ℓ are, respectively, the width of the target item
and its distance from the starting position of the pointing object
(mouse cursor or pointing finger) [12]; a and
b are empirically estimated. As intuitively expected, the
selection task can be simplified and expedited by applying a
pointing facilitation scheme, such as increasing the item
size (larger W) or moving it closer to the cursor (smaller ℓ).
Since a typical graphical user interface (GUI) contains several selectable items, any assistive pointing strategy should
be preceded by a predictor to identify the intended onscreen icon [12]. Hence, the end-point prediction problem
has received notable attention in the human-computer
interaction (HCI) area, e.g., [11]-[14] (see [10] and [14] for a
brief overview).
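For concreteness, Fitts' law in (1) is straightforward to evaluate numerically. The sketch below uses only the relation itself; the coefficient values a and b are illustrative placeholders, not figures from the article, since these must be fitted empirically per interface.

```python
import math

def fitts_index_of_difficulty(distance: float, width: float) -> float:
    """Index of difficulty ID = log2(1 + distance/width), in bits."""
    return math.log2(1.0 + distance / width)

def fitts_movement_time(distance: float, width: float,
                        a: float = 0.1, b: float = 0.15) -> float:
    """Predicted acquisition time T = a + b * ID.

    a and b are empirically estimated; the defaults here are
    illustrative only.
    """
    return a + b * fitts_index_of_difficulty(distance, width)

# Doubling the icon width W lowers the index of difficulty and hence T,
# which is the rationale behind pointing facilitation schemes.
print(fitts_index_of_difficulty(distance=200.0, width=10.0))  # ~4.39 bits
print(fitts_index_of_difficulty(distance=200.0, width=20.0))  # ~3.46 bits
```

This also makes the trade-off explicit: a predictor that effectively reduces ℓ (e.g., by selecting midair) shortens T without consuming extra screen real estate the way enlarging W does.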
The majority of existing HCI studies focus on pointing in
two dimensions (2-D) via a mouse or mechanical device on a computer screen to acquire GUI icons. They
often use deterministic pointing kinematics models for end-point prediction, assuming that 1) the pointing object (cursor)
velocity has a consistent profile and is zero on arrival at the destination, and 2) the cursor heads at a nearly constant angle
toward its end point. Both premises make intuitive sense for
mouse pointing in 2-D; however, they do not necessarily hold
for freehand pointing gestures in 3-D [10]. For example, Figure 2(a) shows that the pointing fingertip heading angle to an
on-screen icon changes drastically throughout a sample of
freehand pointing gestures recorded in an instrumented car;
d_I is the location of the intended on-screen destination in 3-D,
and Y_k is the 3-D Cartesian coordinates of the pointing fingertip at the time instant t_k.
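The two quantities plotted in Figure 2 can be computed directly from successive fingertip positions. A minimal sketch, assuming positions are given in millimeters and the angle is measured between the displacement Y_k − Y_{k−1} and the direction from Y_{k−1} to the destination d_I:

```python
import math

def angle_to_target(y_prev, y_curr, d_target):
    """Heading angle (degrees) between the fingertip displacement
    Y_k - Y_{k-1} and the direction toward the destination d_I
    (assumed here to be measured from Y_{k-1})."""
    v = [c - p for c, p in zip(y_curr, y_prev)]    # displacement
    w = [d - p for d, p in zip(d_target, y_prev)]  # direction to target
    dot = sum(a * b for a, b in zip(v, w))
    nv = math.sqrt(sum(a * a for a in v))
    nw = math.sqrt(sum(a * a for a in w))
    cos_t = max(-1.0, min(1.0, dot / (nv * nw)))   # clamp rounding error
    return math.degrees(math.acos(cos_t))

def fingertip_speed(y_prev, y_curr, dt):
    """Speed ||Y_k - Y_{k-1}||_2 / dt in mm/s, for a sampling period dt."""
    return math.sqrt(sum((c - p) ** 2 for c, p in zip(y_curr, y_prev))) / dt

# Moving straight toward the target gives a zero heading angle.
print(angle_to_target((0, 0, 0), (1, 0, 0), (10, 0, 0)))  # 0.0
print(fingertip_speed((0, 0, 0), (3, 4, 0), dt=0.05))     # 100.0
```

Under the constant-heading premise 2) above, the first function would return a near-constant value over a trajectory; Figure 2(a) shows that for in-car freehand gestures it does not.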
Data-driven prediction techniques, such as those in [13] and
[19], can be applied to infer the intended destination of a
pointing task. They often utilize a pointing motion model
learned from previously recorded interactions, necessitating the
availability of a complete data set of training examples of
pointing trajectories. This requirement is particularly stringent for freehand gestures approaching a display in 3-D to
select icons on GUIs of various possible layouts, due to the
very large number of possible paths. Additionally, in an
automotive HMI context, a user might be expected to
undertake only a few pointing gestures, for instance, to set up
the IVIS preferences, during his or her first use of the system; i.e.,
a very limited set of training tracks is often available. On
the other hand, the predictive display system discussed here
employs known motion as well as sensor models and thus
can use a state-space-modeling approach, albeit with a few
unknown parameters. It requires minimal training and is
computationally efficient.
In the area of object tracking, e.g., in surveillance applications, knowing the destination of a tracked object not only leads

