Supplementary Material for "Real-Time Control of Sonification Models with an Audio-Haptic Interface"

Hermann T, Krause J, Ritter H (2002)
Bielefeld University.

Download
  • nneurons_15.wav (Open Access)
  • nneurons_10.wav (Open Access)
Abstract
This paper presents a new interface for controlling sonification models. A haptic controller interface is developed that allows the user both to manipulate a sonification model, e.g. by interacting with it, and to obtain a haptic data representation. A variety of input types are supported by a hand-sized interface, including shaking, squeezing, hammering, moving, rotating and accelerating. The paper presents details on the interface under development and demonstrates the application of the device for controlling a sonification model. For this purpose, the Data-Solid Sonification Model is introduced, which provides an acoustic representation of the local neighborhood relations in high-dimensional datasets for binary classification problems. The model is parameterized by a reduced data representation obtained from a growing neural gas (GNG) network. Sound examples are given to demonstrate the device and the sonification model.

## Sound Demonstrations for the Audio-Haptic Ball Interface

* #### Sounds for atomic collisions
  Each model mass is assigned a material according to the data within the neuron's Voronoi cell. In our setting, binary classification data is used, and the object type is A (or B) if data from class A (or B) dominates the cell.
  + Collision between an A object and an A object: [sound](https://pub.uni-bielefeld.de/download/2704140/2704141) (wood)
  + Collision between a B object and a B object: [sound](https://pub.uni-bielefeld.de/download/2704140/2704142) (plastic)
  + Collision between an A object and a B object: [sound](https://pub.uni-bielefeld.de/download/2704140/2704143) (glass)
* #### Table 1: Sound examples for synthetic datasets from binary classification problems
File/Track:
  • A) two well-separated classes -- sound
  • B) two classes with little overlap along one axis -- sound
  • C) two overlapping, inseparable classes -- sound
Description: The sonification model was excited by shaking the interface device. The shaking activation is given by the accelerations a_x(t), a_y(t). During the first half of each sound example, the interface is shaken along the x axis; during the second half, along the y axis.
Duration: about 5 seconds each.
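The material assignment described above (wood for class-A cells, plastic for class-B cells) can be sketched as follows. This is a hypothetical illustration, not the authors' implementation; the function name `assign_materials` and the array layout are assumptions:

```python
import numpy as np

def assign_materials(points, labels, neurons):
    """Assign each neuron a collision material based on which class
    dominates its Voronoi cell (illustrative sketch, not the authors' code).

    points  : (n, d) array of data points
    labels  : (n,) array with 0 for class A, 1 for class B
    neurons : (m, d) array of GNG neuron positions
    """
    # Nearest neuron index for every data point defines its Voronoi cell.
    dists = np.linalg.norm(points[:, None, :] - neurons[None, :, :], axis=-1)
    cell = dists.argmin(axis=1)

    materials = []
    for j in range(len(neurons)):
        member_labels = labels[cell == j]
        n_a = np.sum(member_labels == 0)
        n_b = np.sum(member_labels == 1)
        # Class-A cells sound like wood, class-B cells like plastic;
        # mixed A-B collisions are rendered with a glass sound.
        materials.append("wood" if n_a >= n_b else "plastic")
    return materials
```

With this assignment, a collision between two masses simply looks up the two materials to pick one of the three collision sounds.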
* #### Table 2: Sound examples for Data-Solid sonifications using a GNG on dataset (B) (see above) with different network complexities
File/Track:
Description: Shaking excitation of the dataset with increasing network complexity.
Duration: about 2 seconds per example.
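The shaking excitation by the accelerations a_x(t), a_y(t) can be sketched as an inertial force acting on all model masses. This is a minimal sketch under assumed physics (a simple velocity update in the device frame), not the authors' synthesis code:

```python
import numpy as np

def apply_shaking(velocities, accel, dt):
    """One excitation step: the measured interface acceleration (a_x, a_y)
    acts on every model mass, as if the whole data solid were shaken
    together with the device (assumed model).

    velocities : (m, 2) array of mass velocities, updated in place
    accel      : (a_x, a_y) measured device acceleration
    dt         : time step in seconds
    """
    # In the frame of the shaken device each mass feels the inertial
    # acceleration -a, so every velocity changes by -a * dt.
    velocities -= np.asarray(accel, dtype=float) * dt
    return velocities
```

Shaking along the x axis then excites motion (and collisions) along x only, which is what distinguishes the two halves of each sound example.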
* Sound example for shaking while the GNG adaptation proceeds, i.e. while the data-solid structure changes over time: [sound example](https://pub.uni-bielefeld.de/download/2704140/2704153). As the GNG grows, more and more neurons exist, which can be heard as an increasing number of collisions. From the pitch it can be perceived that the number of data points having a given neuron as their nearest neighbor decreases: new neurons are added between neurons that are frequently activated.
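The pitch behavior described in this example can be sketched with a simple mapping: the more data points fall into a neuron's Voronoi cell, the "heavier" the mass and the lower its pitch. The f ∝ 1/√m relation and the function `cell_pitches` are assumptions for illustration, not the published mapping:

```python
import numpy as np

def cell_pitches(points, neurons, base_freq=220.0):
    """Assumed pitch mapping: a neuron's collision pitch rises as fewer
    data points fall into its Voronoi cell (fewer points -> lighter
    mass -> higher pitch, using f proportional to 1/sqrt(mass))."""
    # Count data points per Voronoi cell (nearest-neuron assignment).
    dists = np.linalg.norm(points[:, None, :] - neurons[None, :, :], axis=-1)
    counts = np.bincount(dists.argmin(axis=1), minlength=len(neurons))
    # The fullest cell gets the base frequency; emptier cells sound higher.
    return base_freq * np.sqrt(counts.max() / np.maximum(counts, 1))
```

As GNG growth splits heavily populated cells, the per-cell counts drop and the collision pitches rise, matching what is heard in the example.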
Publishing Year: 2002
Data Re-Use License
This Supplementary Material for "Real-Time Control of Sonification Models with an Audio-Haptic Interface" is made available under the Open Database License: http://opendatacommons.org/licenses/odbl/1.0. Any rights in individual contents of the database are licensed under the Database Contents License: http://opendatacommons.org/licenses/dbcl/1.0/

Cite this

Hermann T, Krause J, Ritter H. Supplementary Material for "Real-Time Control of Sonification Models with an Audio-Haptic Interface". Bielefeld University; 2002.
All files available under the following license(s):
Main File(s)
Access Level: Open Access
Last Uploaded: 2016-01-26T11:25:33Z

This data publication is cited in the following publications:
Real-Time Control of Sonification Models with an Audio-Haptic Interface
Hermann T, Krause J, Ritter H (2002)
In: Proceedings of the International Conference on Auditory Display. Nakatsu R, Kawahara H (Eds). Kyoto, Japan: ICAD: 82-86.