
Biologically Inspired Vision and
Touch Sensing to Optimize 3D
Object Representation
and Recognition
Ghazal Rouhafzay, Ana-Maria Cretu, and Pierre Payeur

3D representation and recognition of objects are two pivotal steps for autonomous robots to safely explore and interact with an unknown environment and to manipulate objects. 3D modeling is beneficial in various robotic applications such as object grasping, pose estimation, and robot navigation and localization, and real-time data acquisition and accurate object representation are essential in such practical applications. The recognition of objects in an environment, on the other hand, is indispensable for situational awareness and for enabling the robot to interact effectively with complex environments.
Robot vision can be considered the most informative and reliable sensing modality in autonomous robots. Nevertheless, vision fails in a number of situations, including low-light environments, cases where an object is occluded or out of the camera's field of view, and situations in which objects are not visually distinguishable. Tactile sensing, an indispensable element of dexterous robotic manipulation, can be efficiently integrated with other sensory modalities, in particular vision, to increase the reliability of an autonomous robot. It provides a wide range of information about objects, including surface properties such as roughness, texture, vibration, temperature, and local shape, all important features that can contribute to better identifying an object. Moreover, the combined use of vision and touch in humans has been demonstrated to facilitate the manipulation, grasping, and handling of objects, and could therefore be exploited to increase the efficiency of autonomous robots in a variety of tasks. However, visuo-tactile integration, and the creation of efficient computational methods to help a robot successfully recognize and manipulate the objects it interacts with, remain challenging.
A considerable research effort has been devoted in the literature to efficiently integrating the two sensing modalities. Nevertheless, currently published works on visuo-haptic interaction only use visual data to increase the spatial resolution of tactile data, to resolve conflicts (such as cases where the tactile information is faulty), or to recognize objects from tactile and visual data jointly. Because the acquisition and processing of tactile data is itself time-consuming, such approaches to visuo-tactile integration carry a high computational cost, making them very difficult, if not impossible, to use in real-time interaction scenarios. Alternatively, the sophisticated cognitive skills of the human brain and its patterns of natural intelligence have encouraged scientists to develop biologically inspired computational techniques that bring automatic processing capabilities to computers and robots.
Referring to biological research, we can draw three main conclusions about the interaction and collaboration of the visual and haptic sensory modalities: 1) tactile salient features also attract visual attention to their location [1]; 2) the combined use of vision and touch works more efficiently than vision or touch exploited separately [2]; and 3) visual and tactile object recognition rely on similar processes in terms of categorization, recognition, and representation [2]. These conclusions suggest that visuo-tactile integration is a promising solution to optimize the process of object modeling. Visual data, in the form of salient regions identified by a model of visual attention, can be employed (according to 1) to guide the process of tactile data acquisition. Furthermore, visuo-tactile integration can be performed (according to 2 and 3) at a higher (perception) level based on the similarities between the two sensing modalities.
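To make the saliency-guided acquisition step concrete, the following is a minimal Python sketch, not the attention model used in this work: it substitutes spectral-residual saliency (Hou and Zhang, 2007) for a full computational model of visual attention and greedily selects the k most salient image locations as candidate tactile probing targets. All function names and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def spectral_residual_saliency(gray):
    """Spectral-residual saliency (Hou & Zhang, 2007) for a 2D grayscale image."""
    f = np.fft.fft2(gray)
    log_amp = np.log1p(np.abs(f))           # log amplitude spectrum
    phase = np.angle(f)                     # phase spectrum, kept untouched
    residual = log_amp - uniform_filter(log_amp, size=3)  # spectral residual
    sal = np.abs(np.fft.ifft2(np.exp(residual) * np.exp(1j * phase))) ** 2
    sal = gaussian_filter(sal, sigma=3)     # smooth to obtain the saliency map
    return sal / (sal.max() + 1e-12)

def select_probe_targets(sal, k=5, suppress_radius=15):
    """Greedily pick the k most salient pixels as candidate tactile probing
    targets, zeroing a disc around each pick so targets spread over the object."""
    sal = sal.copy()
    yy, xx = np.mgrid[0:sal.shape[0], 0:sal.shape[1]]
    targets = []
    for _ in range(k):
        y, x = np.unravel_index(np.argmax(sal), sal.shape)
        targets.append((int(y), int(x)))
        sal[(yy - y) ** 2 + (xx - x) ** 2 <= suppress_radius ** 2] = 0.0
    return targets

# Toy usage: a bright synthetic patch should attract the first probe targets.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
img[40:60, 40:60] += 2.0
print(select_probe_targets(spectral_residual_saliency(img), k=3))
```

In a real system, the selected image locations would be back-projected onto the 3D object surface to drive the tactile probe, a step omitted here for brevity.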
Since collecting large datasets of tactile data for training a model is a much more complex task than collecting visual data, a transfer of learning from vision to touch is expected both to enhance the performance of tactile object recognition and to amalgamate the visual and tactile data processing units in robots. This paper presents research initiatives carried out by the authors to validate these biologically inspired assumptions and to efficiently merge measurements from different instrumentation technologies into a framework operating in the context of practical robotic tasks that involve 3D object representation and recognition.
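As a concrete illustration of this transfer-of-learning idea, here is a minimal PyTorch sketch; it is our own illustrative example, not the architecture used in this work. An ImageNet-pretrained ResNet-18 backbone is frozen and only a newly added classification head is trained on tactile inputs (e.g., tactile pressure maps replicated to three channels); NUM_CLASSES, build_touch_classifier, and train_step are hypothetical names.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 10  # hypothetical number of object classes

def build_touch_classifier(num_classes=NUM_CLASSES):
    """Reuse visual features for touch: freeze a vision-pretrained backbone
    and train only a new classification head on tactile images."""
    net = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    for p in net.parameters():
        p.requires_grad = False  # keep the visual features fixed
    net.fc = nn.Linear(net.fc.in_features, num_classes)  # new trainable head
    return net

model = build_touch_classifier()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(tactile_batch, labels):
    """One hypothetical training step: `tactile_batch` is an (N, 3, 224, 224)
    tensor of tactile maps replicated to 3 channels; `labels` is an (N,) tensor."""
    optimizer.zero_grad()
    loss = criterion(model(tactile_batch), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone keeps the number of trainable parameters small, which is the point of the transfer: the scarce tactile data only has to fit the head, not the whole network.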

Techniques for Visual and Tactile Data Acquisition
As illustrated in Fig. 1, the framework makes use of visual
and tactile data acquired over the surface of objects. A variety
