Towards gestural understanding for intelligent robots

Fritsch JN (2012)
Bielefeld: Universität Bielefeld.

Bielefeld E-Habilitation | English
Author
Fritsch, Jan Nikolaus
Reviewer / Supervisor
Kummert, Franz
Abstract / Note
A strong driving force of scientific progress in the technical sciences is the quest for systems that assist humans in their daily life and make their life easier and more enjoyable. Nowadays, smartphones are probably the most typical instances of such systems. Another class of systems receiving increasing attention are intelligent robots. Instead of offering a smartphone touch screen for selecting actions, these systems are intended to offer a more natural human-machine interface to their users. Out of the large range of actions performed by humans, gestures performed with the hands play a very important role, especially when humans interact with their direct surroundings, for example by pointing to an object or manipulating it. Consequently, a robot has to understand such gestures to offer an intuitive interface. Gestural understanding is, therefore, a key capability on the way to intelligent robots. This book deals with vision-based approaches for gestural understanding. Over the past two decades, this has been an intensive field of research, resulting in a variety of algorithms to analyze human hand motions. Following a categorization of different gesture types and a review of other sensing techniques, the design of vision systems that achieve hand gesture understanding for intelligent robots is analyzed. For each of the individual algorithmic steps (hand detection, hand tracking, and trajectory-based gesture recognition), a separate chapter introduces common techniques and algorithms and provides example methods. The resulting recognition algorithms consider gestures in isolation and are often not sufficient for interacting with a robot, which can only understand such gestures when incorporating context, for example which object was pointed at or manipulated. Going beyond purely trajectory-based gesture recognition by incorporating context is an important prerequisite for gesture understanding and is addressed explicitly in a separate chapter of this book. Two types of context, user-provided context and situational context, are distinguished, and existing approaches that incorporate context for gestural understanding are reviewed. Example approaches for both context types provide a deeper algorithmic insight into this field of research. An overview of recent robots capable of gesture recognition and understanding summarizes the quality of human-robot interaction realized so far. The approaches for gesture understanding covered in this book are manually designed, whereas humans learn to recognize gestures automatically while growing up. The last chapter, which completes this book, highlights promising research aimed at analyzing developmental learning in children in order to mimic this capability in technical systems, as this research direction may be highly influential for creating future gesture understanding systems.
Year
2012
Page(s)
232
Page URI
https://pub.uni-bielefeld.de/record/2515054

Cite

Fritsch JN. Towards gestural understanding for intelligent robots. Bielefeld: Universität Bielefeld; 2012.
Fritsch, J. N. (2012). Towards gestural understanding for intelligent robots. Bielefeld: Universität Bielefeld.
Fritsch, Jan Nikolaus. 2012. Towards gestural understanding for intelligent robots. Bielefeld: Universität Bielefeld.
Fritsch, J.N., 2012. Towards gestural understanding for intelligent robots, Bielefeld: Universität Bielefeld.
J.N. Fritsch, Towards gestural understanding for intelligent robots, Bielefeld: Universität Bielefeld, 2012.
Fritsch, J.N.: Towards gestural understanding for intelligent robots. Universität Bielefeld, Bielefeld (2012).
Fritsch, Jan Nikolaus. Towards gestural understanding for intelligent robots. Bielefeld: Universität Bielefeld, 2012.
All files available under the following license(s):
Copyright Statement:
This object is protected by copyright and/or related rights. [...]
Full text(s)
Access Level
Open Access
Last Uploaded
2019-09-25T06:58:16Z
MD5 Checksum
ce7bd594c626f0923ddd2efe36cd1364