IEEE Robotics & Automation Magazine - December 2022 - 39

perception to recognize objects [25], map the environment
[26], [27], or avoid obstacles [28] and haptic perception to
actively touch objects [16]. Linking action to the perception of information of interest is known as interactive perception, a common human exploration strategy. For instance, if humans cannot recognize an object by vision and touch, they perform different interactions to obtain information from other sensory channels [25].
Smell and taste are rarely used in robotics. One rare example presents a navigation approach that uses smell for odor source localization [29]. Another niche develops a sense of taste using IR spectroscopy or chemical sensors to discriminate food [30], [31]. Driven by computer vision (CV), 2D image processing is the fundamental pillar of robot recognition tasks. However, 3D methods for scene and object reconstruction have gained importance for robots. Various surveys provide an overview of state-of-the-art robot recognition tasks [32], [33]. Established
robotics approaches focus on specific recognition
subareas. The preattentive recognition of primitives comprises methods for normal estimation [34], [35], segmentation [36], [37], edge detection [38], and the detection of simple shapes such as planes and cylinders [39]. The methods usually make
use of the entire sensory input. However, robots often prefilter or downscale the data to match their computational capabilities. For instance, image-based feature recognition can process high-resolution images using image pyramids [40]: features extracted at a lower image resolution provide discrete regions of interest while retaining the accuracy of the original scale.
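The pyramid idea can be sketched as repeated 2x2 average pooling, with a region of interest found at a coarse level mapped back to the full-resolution image. This is only an illustration of the concept; the function names and pooling scheme are not taken from [40]:

```python
import numpy as np

def build_pyramid(image, levels=3):
    """Build an image pyramid by repeated 2x2 average pooling."""
    pyramid = [image]
    for _ in range(levels - 1):
        img = pyramid[-1]
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
        # Average each 2x2 block to halve the resolution.
        down = img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(down)
    return pyramid

def coarse_roi_to_full_res(roi, level):
    """Map a region of interest (y, x, h, w) found at a coarse level back to level 0."""
    scale = 2 ** level
    y, x, h, w = roi
    return (y * scale, x * scale, h * scale, w * scale)

image = np.random.rand(256, 256)
pyramid = build_pyramid(image, levels=3)
print([p.shape for p in pyramid])   # [(256, 256), (128, 128), (64, 64)]
print(coarse_roi_to_full_res((10, 12, 8, 8), level=2))  # (40, 48, 32, 32)
```

Detection runs cheaply on the coarse level; only the mapped-back regions need processing at the original scale.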
Postattentive processing comprises higher-level recognition capabilities built on preattentive processing. Examples include extracting semantics by (re)detecting objects [41], [42], humans [43], [44], and places [45]; reconstructing the scene in 3D via Simultaneous Localization and Mapping (SLAM) [32], [46], [47]; and enriching metric information with semantics, known as semantic segmentation [33], [48], [49].
As with humans, robots use preprocessed attention areas within postattentive processing. For instance, feature-based SLAM methods use sparse feature points to postattentively register sets of sensor data and to detect loop closures, compensating for mapping drift [46], [47].
This example also shows how humans and robots use different sensory sources to achieve highly accurate localization within the scene. Humans combine walked steps with visual information to perceive ego-motion, which improves localization. For illustration, imagine walking a distance with closed eyes: motion drift sooner or later leads to a loss of localization. A series of interconnected spatial-visual features therefore provides absolute localization within a known scene [50].
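The closed-eyes analogy can be illustrated numerically: integrating noisy steps alone lets the localization error accumulate, while periodic absolute fixes (a stand-in for visual landmark observations) bound it. A minimal 1D sketch, with all parameters invented for illustration:

```python
import random

def walk(n_steps, step=0.5, noise=0.05, correct_every=None, seed=1):
    """Dead-reckon noisy step measurements; optionally reset the estimate
    whenever a 'landmark' is observed (an absolute localization fix)."""
    rng = random.Random(seed)
    truth = est = 0.0
    for i in range(1, n_steps + 1):
        truth += step
        est += step + rng.gauss(0.0, noise)  # each step adds measurement error
        if correct_every and i % correct_every == 0:
            est = truth  # landmark observation gives an absolute fix
    return abs(est - truth)

blind = walk(5000)                        # "eyes closed": drift accumulates
corrected = walk(5000, correct_every=50)  # periodic visual fixes bound the error
print(blind, corrected)
```

The blind walk ends with a nonzero residual error, whereas the corrected walk is re-anchored at every landmark observation.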
Analogous to humans, the latest robotic SLAM approaches employ similar techniques. Wheel odometry, visual
odometry, or an inertial measurement unit (IMU) provides
the ego-motion. Laser scanners or camera-based
methods simultaneously estimate the pose of landmarks to
compensate for the ego-motion drift. Also, some approaches
use visual landmarks for place redetection to close the trajectory
loop while mapping [46]. Using high-level features enables robots to add semantics to metric information. Knowing the semantics of an area in space helps robots interpret the scene; for example, it allows the exclusion of dynamic objects such as people during mapping.
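Such semantic exclusion can be sketched as a label-based filter applied before points enter the map; the class labels and point format here are assumptions for illustration, not from a specific SLAM system:

```python
# Semantic classes considered dynamic and therefore excluded from the map.
DYNAMIC_CLASSES = {"person", "car", "bicycle"}

def filter_for_mapping(labeled_points):
    """Keep only points whose semantic label is static scene structure."""
    return [(x, y, z) for (x, y, z, label) in labeled_points
            if label not in DYNAMIC_CLASSES]

scan = [(1.0, 2.0, 0.0, "wall"),
        (1.5, 2.1, 0.0, "person"),
        (3.0, 0.5, 0.0, "floor")]
print(filter_for_mapping(scan))  # the "person" point is excluded
```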
A special challenge of robot perception is the recognition
of scene information from different locations and times. For
instance, a robot recognizes an object in the scene and simultaneously
tracks the object in its field of view (FOV) as long as
it is visible. While previous research proposed using a Kalman
filter [51] or particle filter [52], the latest research utilizes deep
learning-based tracking, such as with a convolutional neural
network (CNN) [53], [54] or a Vision Transformer (ViT)
[55]. The ViT, an architecture originating in natural language processing (NLP), splits images into fixed-size patches; ViTs have gained popularity due to their superior performance on the continuous data streams needed for mobile robots [56]. Since both the object and the robot may move, occlusions, truncation, or invisibility due to sensor noise (as mentioned previously) must be handled.
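A minimal 1D constant-velocity Kalman filter illustrates how such a tracker can coast through occlusion by predicting without updating. The noise parameters and class interface are illustrative, not taken from [51]:

```python
class ConstantVelocityKF:
    """Minimal 1D constant-velocity Kalman filter for object tracking.
    During occlusion, calling predict() without update() coasts the track."""
    def __init__(self, pos=0.0, vel=0.0, dt=1.0, q=1e-2, r=1e-1):
        self.x = [pos, vel]                  # state: position, velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]    # state covariance
        self.dt, self.q, self.r = dt, q, r   # q: simplified process noise, r: measurement noise

    def predict(self):
        dt, (p, v), P = self.dt, self.x, self.P
        self.x = [p + dt * v, v]
        # P = F P F^T + Q  for F = [[1, dt], [0, 1]] (Q simplified to q on the diagonal)
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q
        self.P = [[p00, P[0][1] + dt * P[1][1]],
                  [P[1][0] + dt * P[1][1], P[1][1] + self.q]]
        return self.x[0]

    def update(self, z):
        # Measure position only: H = [1, 0].
        s = self.P[0][0] + self.r
        k0, k1 = self.P[0][0] / s, self.P[1][0] / s
        resid = z - self.x[0]
        self.x = [self.x[0] + k0 * resid, self.x[1] + k1 * resid]
        P = self.P
        self.P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                  [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]

kf = ConstantVelocityKF(pos=0.0, vel=1.0)
for t in range(1, 6):
    kf.predict()
    kf.update(float(t))   # object observed moving 1 unit per frame
occluded = [kf.predict() for _ in range(3)]  # coast through 3 occluded frames
print(occluded)  # [6.0, 7.0, 8.0]
```

With measurements that exactly match the motion model, the predictions continue the trajectory through the occlusion.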
When the same object appears again, reidentification (Re-ID), which reassigns the object's ID, is beneficial for better understanding the scene. Modern Re-ID approaches, similar to tracking, use a CNN [57], [58], a ViT [59], [60], or end-to-end methods [61].
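A mechanism common to many such approaches is comparing appearance embeddings against a gallery of previously seen objects. A minimal sketch using cosine similarity over hypothetical embedding vectors (the threshold and gallery format are assumptions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def reidentify(query, gallery, threshold=0.8):
    """Return the ID of the most similar stored embedding, or None if no
    gallery entry exceeds the similarity threshold (a new object)."""
    best_id, best_sim = None, threshold
    for obj_id, emb in gallery.items():
        sim = cosine(query, emb)
        if sim > best_sim:
            best_id, best_sim = obj_id, sim
    return best_id

gallery = {"obj_1": [0.9, 0.1, 0.0], "obj_2": [0.0, 0.2, 0.98]}
print(reidentify([0.88, 0.15, 0.02], gallery))  # → obj_1
```

In practice the embeddings would come from the CNN or ViT backbone; here they are hand-written vectors.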
Knowledge Representation
The second step of the perception process is the representation of environmental information. A knowledge base manages recognized information from different sources, levels of abstraction, times, and places in a centralized and ordered structure. This structure includes understandable information about the scene. The function of knowledge representation within the human brain has been a controversial topic since the so-called gestalt theory. Modern accounts, such as Wagemans and Kimchi [24] or Hommel et al. [62], reveal hierarchically organized spatial layouts that represent human perception memory. A relationship-focused, multilevel hierarchical structure of parts represents environmental information.
The transfer of the main functionalities of human
knowledge representation to robots requires a complex
memory structure focused on flexibility. The knowledge
representation must be capable of merging observations
and interpretations from different sources and times. For
instance, recognized information, such as shape, texture,
posture, state, probabilities, and trajectory (compare the "The Recognition of Information" section), must be managed
in real time within the knowledge base. This issue
sets high requirements for the underlying knowledge base
as every piece of environmental information needs a
known structured representation. Furthermore, the
knowledge base includes initial and postprocessed knowledge.
Robots usually store and represent the scene knowledge
in a database [63], [64], [65], allowing them to deploy