
a human-understandable ontology that conceptualizes multiple
entities within a domain and their relationships [66].
The scene representation comprises various kinds of perception-related
information, such as extracted spatial features for
navigation, manipulation, and semantically enhanced maps
[67]. The requirements for robotic knowledge representation are high. Ideally, it must be real-time capable, generic, scalable, and flexible in structure; it must be updatable and extensible during the robot's lifetime; and it must be easy to connect to for access and data sharing. Generally, there are two categories of databases: graph based and document based. Both seem suitable for this task, as they provide comprehensive features
to cover these requirements [68], [69]. Graph-based databases, much like relational database management systems, represent knowledge through relations.
Here, relations must be explicitly defined in a common format so that semantics can be linked through an ontology [70]. In contrast, document-based databases represent data in JavaScript Object Notation (JSON)-like documents without the need for relations or predefined structures. Nevertheless, these databases provide features for querying and indexing the data so that dependencies and relations can be modeled implicitly and dynamically. For instance, Kunze et al. [63] propose spatial and temporal indexing to query relations within their document-based database.
In addition, linking the knowledge base to decision making [71] and providing the represented knowledge to robot actions, such as manipulation or movement, enables reactive behaviors.
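As a concrete illustration of the document-based approach, the following Python sketch stores JSON-like observation documents and indexes them spatially and temporally so that relations can be queried dynamically rather than modeled up front. It assumes a locally running MongoDB instance accessed through pymongo; the database, collection, and field names are hypothetical and not taken from [63].

```python
# A minimal sketch of a document-based scene knowledge store with spatial
# and temporal indexing. It assumes a locally running MongoDB instance;
# the database, collection, and field names are hypothetical.
from datetime import datetime, timedelta
from pymongo import MongoClient, GEOSPHERE, ASCENDING

scene = MongoClient("mongodb://localhost:27017")["scene_knowledge"]["observations"]

# Index observations by time stamp and by 2D position in the map frame.
scene.create_index([("stamp", ASCENDING)])
scene.create_index([("pose", GEOSPHERE)])

# Insert a JSON-like observation document; no predefined schema is required.
scene.insert_one({
    "label": "cup",
    "stamp": datetime.utcnow(),
    "pose": {"type": "Point", "coordinates": [8.4043, 49.0069]},  # lon, lat
    "attributes": {"color": "red", "graspable": True},
})

# Query relations implicitly: everything observed within 2 m of a location
# during the last 10 minutes.
recent_nearby = scene.find({
    "stamp": {"$gte": datetime.utcnow() - timedelta(minutes=10)},
    "pose": {"$near": {"$geometry": {"type": "Point",
                                     "coordinates": [8.4043, 49.0069]},
                       "$maxDistance": 2.0}},
})
for doc in recent_nearby:
    print(doc["label"], doc["stamp"])
```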
A comparison of scene knowledge representations indicates that robots have several advantages over humans. First, an artificial scene knowledge representation suffers no memory loss, owing to practically unlimited storage capacity. Second, robots can easily share and reuse perception data from other agents, while humans have to transfer knowledge into an appropriate modality, such as verbal communication. Additionally,
humans are limited in the range of information
exchange without technical assistance. In contrast, robots
can share data in their original format over networks. The
sharing of perceptional information enables robots to
directly exchange data with the infrastructure or other
robots. Another difference between human and robot
knowledge representation is the capability of robots to start
with a preinstalled environment perception model. Robots
can use the prior information of a building information
model [72] or a partially or fully premapped environment
[64]. Using prior scene knowledge reduces the setup time,
especially when using multiple agents.
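One way to realize such a preinstalled model is to seed the shared knowledge base with prior entities before deployment, so that every connected agent can reuse them immediately. The sketch below assumes the document store from the earlier example and a hypothetical JSON export of a building model or map; the file name, its layout, and the connection string are illustrative.

```python
# A minimal sketch of bootstrapping the knowledge base with prior scene
# knowledge, e.g., entities exported from a building information model or a
# premapped environment. The file name, its JSON layout, and the connection
# string are hypothetical.
import json
from pymongo import MongoClient

scene = MongoClient("mongodb://kb-server:27017")["scene_knowledge"]["observations"]

with open("prior_map_export.json") as f:
    prior_entities = json.load(f)  # e.g., rooms, doors, static furniture

# Tag prior entities so later reasoning can separate them from live observations.
for entity in prior_entities:
    entity["source"] = "prior_map"
scene.insert_many(prior_entities)

# Every agent on the network can connect to the same database and reuse this
# knowledge directly, without converting it into another modality.
```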
Knowledge Interpretation
Based on the available scene knowledge, this perception step interprets existing knowledge to make sense of the scene using cognitive capabilities. It is well established that humans interpret the scene; however, it is unclear how this is done within the brain. Isik et al. [73] found that the human brain starts to recognize view-invariant observations, such as human actions, quickly, within around 200 ms. This suggests that
the brain uses the form, as well as the motions, to represent
states. Furthermore, previous work [74] proposes that the
human brain benefits from causal relations, such as temporal
continuity and spatial relations, among objects, humans, and
their actions. The prerequisite is that a known structure represents
the knowledge (see the "Knowledge Representation"
section). A high-level scene analysis encodes spatial and temporal
relations between instances [75].
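To make this concrete, the following minimal Python sketch encodes qualitative spatial and temporal relations between two observed instances; the instance fields, thresholds, and relation names are illustrative assumptions rather than the scheme of [75].

```python
# A minimal sketch that encodes qualitative spatial and temporal relations
# between observed instances. Instance fields, thresholds, and relation names
# are illustrative assumptions.
from dataclasses import dataclass
from math import dist

@dataclass
class Instance:
    name: str
    position: tuple   # (x, y) in meters, map frame
    t_start: float    # first observation time in seconds
    t_end: float      # last observation time in seconds

def spatial_relation(a: Instance, b: Instance, near_thresh: float = 1.0) -> str:
    return "near" if dist(a.position, b.position) <= near_thresh else "far_from"

def temporal_relation(a: Instance, b: Instance) -> str:
    # A coarse subset of Allen's interval relations.
    if a.t_end < b.t_start:
        return "before"
    if b.t_end < a.t_start:
        return "after"
    return "overlaps"

cup = Instance("cup", (2.0, 1.5), t_start=10.0, t_end=60.0)
person = Instance("person", (2.4, 1.2), t_start=40.0, t_end=90.0)

# The resulting relation triples could be written back into the knowledge base.
print((cup.name, spatial_relation(cup, person), person.name))   # ('cup', 'near', 'person')
print((cup.name, temporal_relation(cup, person), person.name))  # ('cup', 'overlaps', 'person')
```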
In most cases, the interpretation of scene knowledge is a trivial task for humans due to lifelong learning. The interpretation of perceived information has been trained with prior knowledge and depends on culture [76, Ch. 14], context, and the situation itself [77]. Therefore, every person has a unique perception system based on and enriched by their environment. Thus, human interpretation is neither predictable nor always the same. The famous duck-rabbit illusion [78] indicates that even the season can change the interpretation of a scene. Therefore, perception is influenced by environmental factors [77].
For robots, the interpretation of the scene is still an
unsolved problem [79]. The challenge is to generate and make
use of high-level semantic knowledge to reason about the
present scene. There is no commonsense model that can be applied to every environment without adapting the interpretations. The association of scene knowledge across multiple dimensions, such as time and space, with or without relational dependencies between single pieces of information, leads to new knowledge that improves scene understanding.
For instance, a spatial-temporal scene analysis could
reveal daily habits, such as when, how, and how often we go to
the kitchen to fetch a coffee. This example shows that the kind
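A minimal sketch of such a spatial-temporal analysis, assuming a simple observation log of timestamped positions and a hypothetical rectangular kitchen region, could look as follows.

```python
# A minimal sketch of a spatial-temporal habit analysis: counting at which
# hour of the day a tracked person enters the kitchen. The log format and the
# rectangular kitchen region are simplified assumptions.
from collections import Counter
from datetime import datetime

def in_kitchen(x: float, y: float) -> bool:
    # Hypothetical axis-aligned kitchen region in the map frame (meters).
    return 0.0 <= x <= 3.0 and 5.0 <= y <= 8.0

def kitchen_visits(track):
    """track: time-ordered list of (unix_timestamp, x, y) observations."""
    visits = Counter()
    was_inside = False
    for stamp, x, y in track:
        inside = in_kitchen(x, y)
        if inside and not was_inside:  # an entering event
            visits[datetime.fromtimestamp(stamp).hour] += 1
        was_inside = inside
    return visits  # e.g., Counter({7: 5, 13: 3}) -> mostly morning visits

# Such per-hour statistics let a robot predict when the kitchen is likely to
# be occupied and schedule its own tasks around that habit.
```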
This example shows that the kind of required high-level information is environment specific as well as use case specific. The goal is to understand what is
involved and what to do when and with which objects. The
use of high-level semantics within the scene is fundamental
for complex robot behavior tasks. Here, we identify two types of interpretations.
On the one hand, there is research aimed at reconstructing
and interpreting the structured part of the scene, such as
room segmentation, junction detection [32], or occlusion
reasoning for simple shapes [80]. On the other hand, there is
research focusing on the unstructured part that goes deeper
into handling dynamics. The approach presented in [81] is
one of the rare examples in which already collected scene
information is used to gather new information. Observations
of objects are anchored in the scene model, which provides
basic tracking functionality. In combination with
knowledge about the whole scene, including other objects
and their spatial and semantic relations, this is used for reasoning
about the state of occluded objects, which improves tracking and hence the whole scene state.
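The following Python sketch illustrates this kind of occlusion reasoning in a highly simplified form; it is not the method of [81], and the anchor structure, distance threshold, and field names are assumptions.

```python
# A simplified sketch (not the method of [81]) of reasoning about occluded
# objects from anchored observations: if a tracked object is no longer seen
# but another object now stands near its last anchored position, keep the
# anchor alive and mark it as "possibly occluded" instead of deleting it.
from math import dist

def update_anchor(anchor, detections, occluders, radius=0.5):
    """anchor: {'name', 'position', 'state'}; detections/occluders: lists of
    {'name', 'position'} from the current frame. All values are illustrative."""
    for det in detections:
        if det["name"] == anchor["name"] and dist(det["position"], anchor["position"]) < radius:
            anchor.update(position=det["position"], state="visible")
            return anchor
    # Not re-detected: check whether something plausibly hides the object.
    if any(dist(o["position"], anchor["position"]) < radius for o in occluders):
        anchor["state"] = "possibly_occluded"  # keep tracking through the gap
    else:
        anchor["state"] = "missing"
    return anchor

cup = {"name": "cup", "position": (2.0, 1.5), "state": "visible"}
frame = {"detections": [], "occluders": [{"name": "box", "position": (2.1, 1.4)}]}
print(update_anchor(cup, frame["detections"], frame["occluders"])["state"])  # possibly_occluded
```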
The tracking of instances over multiple observations enables further interpretations, such as detecting their actions [82]. The actions of people, together with environmental semantics, are valuable input for a robot, since robots and humans usually share an environment. So, robots need to understand these actions and fulfill
