IEEE Robotics & Automation Magazine - September 2023 - 70

This article is concerned with the specific case of training an object-recognition system using only a priori visual data to recognize the same object from the tactile modality, even though the object has never previously been sensed through touch. Humans rely on this visuo-tactile cross modality and are therefore able to reconstruct a vision-based representation of objects and recognize them by touch alone. Similarly, exploiting this type of cross modality in robotics would allow learning an object representation in a controlled environment using vision and deploying the system in more challenging scenarios where vision is not available (e.g., manipulation tasks in clutter, or where the target objects are not directly visible). The work proposed in [5] refers to this as cross-modal recognition (CMR) or visuo-tactile recognition (VTR), taking inspiration from its psychological definitions [6]. Beyond [5], other examples of visuo-tactile cross modality can be found in the literature [7], [8]. The work presented in [7] explored visuo-tactile cross modality to generate tactile images from visual images and vice versa; yet the training of the two systems required both tactile and visual data. The work in [8] used vision to estimate the pose of an object and proposed a Bayesian algorithm with linear Kalman filters to hone that prediction with each sequential touch.
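The sequential-refinement idea behind [8] can be illustrated with a minimal linear Kalman measurement update. The state vector, noise covariances, and identity observation model below are illustrative assumptions, not the formulation used in the cited work:

```python
import numpy as np

def kalman_touch_update(x, P, z, R):
    """Fuse one touch measurement z (covariance R) into the pose
    estimate x (covariance P). Static object, so no predict step."""
    H = np.eye(len(x))                  # assume touch observes the pose directly
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ (z - H @ x)         # corrected estimate
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

rng = np.random.default_rng(0)
true_pose = np.array([0.10, -0.05, 0.30])  # hypothetical x, y, yaw
x = np.zeros(3)                            # vision-based prior mean
P = np.eye(3) * 0.05                       # prior covariance
R = np.eye(3) * 0.01                       # touch noise covariance

for _ in range(10):                        # sequential touches
    z = true_pose + rng.normal(0.0, 0.1, 3)  # noisy touch measurement
    x, P = kalman_touch_update(x, P, z, R)
# each touch shrinks the pose covariance P toward zero
```

Each update tightens the estimate, matching the intuition that every additional touch hones the vision-based prior.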
To the best of our knowledge, only Falco et al. [5] have performed CMR by training a system with only the visual modality on a set of quasi-planar rigid objects. They found point clouds to be a suitable representation to encode visual and tactile data for this task. Their approach enriches the ensemble of shape functions (ESF) with information from the signature of histograms of orientations (SHOT) to form the cross-modal point cloud descriptor (CLUE) and subsequently uses a geodesic flow kernel (GFK) transfer-learning technique to increase cross-modal performance. The limitation of this work is that the proposed training pipeline, based on an ensemble of global handcrafted descriptors for point clouds, requires full tactile exploration of the object to perform the predictions. However, as tactile exploration is a time-expensive task, this article focuses on recognizing objects from partial observations and making predictions that can be iteratively improved as more data are gathered. The descriptors employed in [5] are global, requiring the full tactile model of the object, and therefore are not directly employable when attempting recognition from partial observations. Conversely, in this article, we investigate the use of data-driven techniques to learn more task-specific representations. Therefore, instead of exploiting existing descriptors, we define the task and allow the proposed learning procedure to compose the features.
Neural networks (NNs) have been used extensively for point cloud recognition in past years to statistically learn point cloud descriptors or shape embeddings [9]. These descriptors adapt based on the training dataset and the formulation of the learning task, learning geometric relationships directly from the data. Rather than proposing a handcrafted descriptor that can capture local shapes, we utilize the established point-based architecture PointNet [10] to extract task-driven shape descriptors. The work presented in [11] noted the gap in the research on partial point cloud recognition and explored the ability of PointNet to recognize partial and noisy point clouds. The authors found that it was vital to expose the network to partial representations during training. In this article, we take it a step further by formulating a learning task with a training procedure based on CL [12] to foster learning of local descriptors from sparse tactile data, represented as point clouds.
PointNet, a point-based architecture, was chosen over projection- or volumetric-based methods such as multiview convolutional NNs (MVCNNs) and VoxNet [13]. Projection-based methods rely on a meshing preprocessing step that, besides being computationally expensive, assumes the emptiness of the unexplored regions; this is undesirable for recognizing partially explored shapes. Volumetric-based methods, which instead construct data structures to represent the occupancy of a 3D grid and enable the use of 3D convolutions, have been surpassed by point-based networks [9]. Over the last few years, PointNet has been influential in deep shape recognition, as several point-based networks incorporate it for shape-feature extraction [14] as well as for creating encodings for generative adversarial networks [15]. Although a dynamic graph CNN [13], an extension of PointNet, would also be a good choice, our choice fell on PointNet as it is more computationally efficient because it does not require the computation of graph structures in latent space. In summary, current works mainly use tactile sensing alone and require slow-to-collect tactile datasets, without exploiting the readiness of visual data. Existing works tackle the problem by defining handcrafted descriptors. Paganoni et al. [11] instead studied partial point cloud recognition but derived samples from the ModelNet40 dataset, which is a high-quality, low-noise CAD dataset. They also analyzed the performance of PointNet under noise without attempting to improve it. However, as explained in the "Problem Definition" section, noise and uncertainties due to representation differences are core to the issue of VTR. Furthermore, in contrast to the partial point clouds that would be generated during a tactile exploration, the work in [11] used simulated laser scans and photogrammetry from single viewpoints. By contrast, tactile point clouds may be composed of sparse and unconnected clusters of points collected from any given surface of the object. This study explores decomposing whole point clouds into patchy partial samples to more
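The point-based principle motivating the choice of PointNet can be sketched as a shared per-point MLP followed by a symmetric max pooling, which makes the resulting descriptor independent of the point ordering. The weights below are random placeholders (a trained network would learn them), and the sketch omits PointNet's input/feature transform nets and classifier head:

```python
import numpy as np

rng = np.random.default_rng(42)
W1 = rng.standard_normal((3, 64))    # shared MLP layer 1, applied to every point
W2 = rng.standard_normal((64, 128))  # shared MLP layer 2

def pointnet_descriptor(points):
    """points: (N, 3) array -> (128,) permutation-invariant descriptor."""
    h = np.maximum(points @ W1, 0.0)  # per-point features, ReLU
    h = np.maximum(h @ W2, 0.0)
    return h.max(axis=0)              # symmetric function: max over points

cloud = rng.standard_normal((256, 3))          # toy point cloud
shuffled = cloud[rng.permutation(len(cloud))]  # same points, different order
d1 = pointnet_descriptor(cloud)
d2 = pointnet_descriptor(shuffled)
assert np.allclose(d1, d2)  # max pooling removes dependence on ordering
```

The max pooling is what lets the descriptor operate directly on unordered points, with no meshing or voxelization preprocessing.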

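The patchy decomposition of whole point clouds described above can be sketched by sampling a few seed points and keeping only their nearest neighbors, mimicking the disconnected clusters of contact points a tactile exploration would produce. The seed count and neighborhood size are illustrative assumptions, not the values used in this work:

```python
import numpy as np

def patchy_sample(points, n_patches=3, k=30, rng=None):
    """points: (N, 3). Return the union of n_patches k-nearest-neighbor
    clusters around random seed points, as a patchy partial sample."""
    if rng is None:
        rng = np.random.default_rng()
    keep = set()
    for seed_idx in rng.choice(len(points), n_patches, replace=False):
        d = np.linalg.norm(points - points[seed_idx], axis=1)
        keep.update(np.argsort(d)[:k].tolist())  # k closest points to the seed
    return points[sorted(keep)]

rng = np.random.default_rng(1)
cloud = rng.standard_normal((1024, 3))
partial = patchy_sample(cloud, n_patches=3, k=30, rng=rng)
# overlapping patches may share points, so at most 3 * 30 points survive
```

Training on such samples exposes the network to the sparse, unconnected observations it will face at test time.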