Data-based analysis of speech and gesture: the Bielefeld Speech and Gesture Alignment corpus (SaGA) and its applications

Lücking A, Bergmann K, Hahn F, Kopp S, Rieser H (2013)
Journal on Multimodal User Interfaces 7(1-2): 5-18.

Journal Article | Published | English
 
Abstract / Remark
Communicating face-to-face, interlocutors frequently produce multimodal meaning packages consisting of speech and accompanying gestures. We discuss a systematically annotated speech and gesture corpus consisting of 25 route-and-landmark-description dialogues, the Bielefeld Speech and Gesture Alignment corpus (SaGA), collected in experimental face-to-face settings. We first describe the primary and secondary data of the corpus and its reliability assessment. Then we go into some of the projects carried out using SaGA, demonstrating the wide range of its usability: on the empirical side, there is work on gesture typology, on individual and contextual parameters influencing gesture production, and on gestures’ functions for dialogue structure. Speech-gesture interfaces have been established by extending unification-based grammars. In addition, the development of a computational model of speech-gesture alignment and its implementation constitutes a research line we focus on.
Keywords
Multimodal dialogue; Iconic gesture; Multimodal simulation; Multimodal data; Speech-and-gesture alignment
Year of Publication
2013
Journal Title
Journal on Multimodal User Interfaces
Volume
7
Issue
1-2
Page(s)
5-18
ISSN
1783-7677
eISSN
1783-8738
Page URI
https://pub.uni-bielefeld.de/record/2522299

Cite

Lücking, A., Bergmann, K., Hahn, F., Kopp, S., & Rieser, H. (2013). Data-based analysis of speech and gesture: the Bielefeld Speech and Gesture Alignment corpus (SaGA) and its applications. Journal on Multimodal User Interfaces, 7(1-2), 5-18. doi:10.1007/s12193-012-0106-8
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Full Text(s)
Access Level
Open Access
Last Uploaded
2019-09-06T09:18:05Z
MD5 Checksum
9e0242d242447dfb907f3a72fab11958