Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification"

Hermann T, Henning T, Ritter H (2003)
Bielefeld University.

Data publication
 
Abstract / Remarks
This paper presents the gesture desk, a new platform for a human-computer interface at a regular computer workplace. It extends classical input devices like keyboard and mouse by arm and hand gestures, without the need for inconvenient accessories such as data gloves or markers. A central element is a "gesture box" containing two infrared cameras and a color camera, positioned under a glass desk. Arm and hand motions are tracked in three dimensions. A synchronizer board has been developed to provide active, glare-free IR illumination for robust body and hand tracking. As a first application, we demonstrate interactive real-time browsing and querying of auditory self-organizing maps (AuSOMs). An AuSOM is a combined visual and auditory presentation of high-dimensional data sets. Moving the hand above the desk surface allows the user to select neurons on the map and to manipulate how they contribute to the data sonification. Each neuron is associated with a prototype vector in the high-dimensional space, so that a set of 2D-topologically ordered feature maps is queried simultaneously. The level of detail is selected by hand altitude over the table surface, allowing the user to emphasize or de-emphasize neurons on the map.
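To make the interaction concrete, below is a minimal, hypothetical Python sketch of how hand position and altitude over the desk could select an "aura" of AuSOM neurons and average their prototype vectors for sonification. It is not the authors' implementation; the grid size, the altitude-to-radius mapping, and all names (map_w, altitude_to_radius, query_aura, ...) are illustrative assumptions.

import numpy as np

map_w, map_h, dim = 10, 10, 4                    # 10x10 map with 4-D prototypes (e.g. Iris features)
prototypes = np.random.rand(map_w, map_h, dim)   # stands in for a trained SOM codebook

def altitude_to_radius(altitude_mm, min_r=0.5, max_r=4.0):
    """Map hand altitude over the desk (mm) to an aura radius in map units."""
    return float(np.clip(altitude_mm / 100.0, min_r, max_r))

def query_aura(hand_xy, altitude_mm):
    """Return the aura mask and the feature vector averaged over the selected neurons."""
    radius = altitude_to_radius(altitude_mm)
    xs, ys = np.meshgrid(np.arange(map_w), np.arange(map_h), indexing="ij")
    mask = np.hypot(xs - hand_xy[0], ys - hand_xy[1]) <= radius
    selected = prototypes[mask]                  # prototype vectors of neurons inside the aura
    return mask, selected.mean(axis=0)           # averaged features drive the sonification

mask, avg_features = query_aura(hand_xy=(4.2, 6.8), altitude_mm=250)
print(mask.sum(), "neurons in aura; averaged features:", avg_features)

In this sketch, raising the hand enlarges the aura and blends more neurons into the average (coarser detail), while lowering it narrows the query to a few neurons (finer detail), matching the level-of-detail behavior described in the abstract.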
Year of publication
2003
Page URI
https://pub.uni-bielefeld.de/record/2700998

Cite

Hermann, T., Henning, T., & Ritter, H. (2003). Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification". Bielefeld University. doi:10.4119/unibi/2700998
Full text(s)
Description
Sound example for Section 5: browsing the AuSOM for the Iris data set.

The sonification was rendered by moving the hand slowly over the map at a constant altitude above the desk surface, so that the aura size remained constant. The different rhythmical patterns represent the feature values averaged over the contributing neurons in the aura.
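As a rough illustration of such a mapping, the hypothetical Python sketch below lets each averaged feature value control the pulse rate of one rhythmic stream, so that hovering over different map regions yields different rhythmical patterns; the rate constants and function names are assumptions, not the published rendering.

def rhythm_events(avg_features, duration_s=2.0, base_rate_hz=1.0, rate_span_hz=7.0):
    """Return per-feature lists of pulse onset times (s) for one sonification frame."""
    events = []
    for value in avg_features:                   # features assumed normalized to [0, 1]
        rate = base_rate_hz + rate_span_hz * float(value)
        period = 1.0 / rate
        events.append([i * period for i in range(int(duration_s * rate))])
    return events

for i, onsets in enumerate(rhythm_events([0.2, 0.5, 0.8, 0.35])):
    print("feature", i, ":", len(onsets), "pulses, first onsets", [round(t, 2) for t in onsets[:4]])

With a constant aura size, the patterns change only as the averaged feature values change under the moving hand, which is what the sound example demonstrates.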
Access Level
Open Access
Last Uploaded
2019-09-25T06:34:45Z
MD5 Checksum
41f9c00cb6d771576425e45225af736d


Material in PUB:
Cited by
Gesture desk - An integrated multi-modal gestural workplace for sonification
Hermann T, Henning T, Ritter H (2003)
In: Gesture-Based Communication in Human-Computer Interaction, 5th International Gesture Workshop. Selected Revised Papers. Lecture Notes in Computer Science, 2915. Camurri A, Volpe G (Eds); Berlin: Springer: 369-379.