Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification"

Hermann T, Henning T, Ritter H (2003)
Bielefeld University.

Research Data
Creator
Hermann T, Henning T, Ritter H
Abstract
This paper presents the gesture desk, a new platform for a human-computer interface at a regular computer workplace. It extends classical input devices like keyboard and mouse by arm and hand gestures, without the need for inconvenient accessories like data gloves or markers. A central element is a "gesture box" containing two infrared cameras and a color camera, positioned under a glass desk. Arm and hand motions are tracked in three dimensions. A synchronizer board has been developed to provide active, glare-free IR illumination for robust body and hand tracking. As a first application, we demonstrate interactive real-time browsing and querying of auditory self-organizing maps (AuSOMs). An AuSOM is a combined visual and auditory presentation of high-dimensional data sets. Moving the hand above the desk surface allows the user to select neurons on the map and to manipulate how they contribute to the data sonification. Each neuron is associated with a prototype vector in the high-dimensional space, so that a set of 2D-topologically ordered feature maps is queried simultaneously. The level of detail is selected by the hand's altitude over the table surface, allowing the user to emphasize or de-emphasize neurons on the map.
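To make the aura mechanism concrete, the following Python sketch illustrates one way the selection described above could be computed. It is illustrative only; the function name, the linear falloff, and the altitude-to-radius mapping are assumptions, not the authors' implementation.

# Hypothetical sketch: assign each SOM neuron a contribution weight from the
# tracked hand position, with the aura radius growing with hand altitude.
import numpy as np

def aura_weights(grid_shape, hand_xy, altitude, base_radius=1.0, gain=2.0):
    """Return a weight in [0, 1] for every neuron on the 2D map.

    grid_shape        -- (rows, cols) of the SOM lattice
    hand_xy           -- hand position projected onto map coordinates (x, y)
    altitude          -- hand height above the desk surface (map units)
    base_radius, gain -- assumed mapping from altitude to aura size
    """
    rows, cols = grid_shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    dist = np.hypot(xs - hand_xy[0], ys - hand_xy[1])
    radius = base_radius + gain * altitude            # higher hand -> larger aura
    return np.clip(1.0 - dist / radius, 0.0, 1.0)     # linear falloff inside the aura

# Example: a 10x10 map, hand over neuron (x=6, y=4) at moderate altitude
w = aura_weights((10, 10), hand_xy=(6.0, 4.0), altitude=0.5)
contributing = np.argwhere(w > 0)   # neurons that contribute to the sonification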
Publishing Year
2003
Data Re-Use License
This Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification" is made available under the Open Database License: http://opendatacommons.org/licenses/odbl/1.0. Any rights in individual contents of the database are licensed under the Database Contents License: http://opendatacommons.org/licenses/dbcl/1.0/
PUB-ID

Cite this

Hermann T, Henning T, Ritter H. Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification". Bielefeld University; 2003.
Hermann, T., Henning, T., & Ritter, H. (2003). Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification". Bielefeld University.
Hermann, T., Henning, T., and Ritter, H. (2003). Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification". Bielefeld University.
Hermann, T., Henning, T., & Ritter, H., 2003. Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification", Bielefeld University.
T. Hermann, T. Henning, and H. Ritter, Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification", Bielefeld University, 2003.
Hermann, T., Henning, T., Ritter, H.: Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification". Bielefeld University (2003).
Hermann, Thomas, Henning, Thomas, and Ritter, Helge. Supplementary Material for "Gesture Desk -- An Integrated Multi-Modal Interface for Interactive Sonification". Bielefeld University, 2003.
Main File(s)
File Name
Description
Sound example for Section 5 of the associated paper: browsing the AuSOM for the Iris data set.

The sonification was rendered by moving the hand slowly over the map at a constant altitude above the desk surface, so the aura size remains constant. The different rhythmical patterns represent the feature values averaged over the contributing neurons in the aura.
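As a rough sketch of how such rhythmical patterns could be derived (the weighted averaging and the linear pulse-rate mapping are assumptions rather than the authors' method), the prototype vectors of the contributing neurons are averaged over the aura and each averaged feature drives one rhythmic stream:

# Hypothetical sketch: average the codebook vectors inside the aura and map
# each averaged feature value to a pulse rate for one rhythmic stream.
import numpy as np

def aura_feature_average(prototypes, weights):
    """Weighted average of prototype vectors over the aura.

    prototypes -- SOM codebook of shape (rows, cols, n_features)
    weights    -- aura weights of shape (rows, cols), e.g. from aura_weights()
    """
    w = weights[..., None]
    return (prototypes * w).sum(axis=(0, 1)) / max(weights.sum(), 1e-9)

def feature_to_pulse_rate(value, vmin, vmax, rate_min=1.0, rate_max=8.0):
    """Map one feature value linearly to a pulse rate in Hz (assumed range)."""
    t = np.clip((value - vmin) / (vmax - vmin + 1e-9), 0.0, 1.0)
    return rate_min + t * (rate_max - rate_min)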
Access Level
OA Open Access
Last Uploaded
2016-01-26T12:58:06Z

This data publication is cited in the following publications:
1608160
Gesture desk - An integrated multi-modal gestural workplace for sonification
Hermann T, Henning T, Ritter H (2003)
In: Gesture-Based Communication in Human-Computer Interaction, 5th International Gesture Workshop. Selected Revised Papers. Lecture Notes in Computer Science, 2915. Camurri A, Volpe G (Eds); Berlin: Springer: 369-379.
