The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing
Mastrantuono E, Burigo M, Rodriguez-Ortiz IR, Saldana D (2019)
JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH 62(6): 1625-1656.
Journal Article
| Published | English
Author(s)
Mastrantuono, Eliana;
Burigo, Michele (UniBi);
Rodriguez-Ortiz, Isabel R.;
Saldana, David
Abstract / Remarks
Purpose: The use of sign-supported speech (SSS) in the education of deaf students has recently been discussed in relation to its usefulness with deaf children using cochlear implants. To clarify the benefits of SSS for comprehension, 2 eye-tracking experiments aimed to detect the extent to which signs are actively processed in this mode of communication. Method: Participants were 36 deaf adolescents, including cochlear implant users and native deaf signers. Experiment 1 attempted to shift observers' foveal attention to the linguistic source in SSS from which most information is extracted, lip movements or signs, by magnifying the face area, thus modifying the perceptual accessibility of lip movements (magnified condition), and by constraining the visual field to either the face or the sign through a moving-window paradigm (gaze-contingent condition). Experiment 2 aimed to explore the reliance on signs in SSS by occasionally producing a mismatch between sign and speech. Participants were required to concentrate on the orally transmitted message. Results: In Experiment 1, analyses revealed a greater number of fixations toward the signs and a reduction in accuracy in the gaze-contingent condition across all participants. Fixations toward signs also increased in the magnified condition. In Experiment 2, results indicated lower accuracy in the mismatching condition across all participants. Participants looked more at the sign when it was inconsistent with speech. Conclusions: All participants, even those with residual hearing, rely on signs when attending to SSS, either peripherally or through overt attention, depending on the perceptual conditions.
Year of Publication
2019
Journal Title
JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH
Volume
62
Issue
6
Page(s)
1625-1656
ISSN
1092-4388
eISSN
1558-9102
Page URI
https://pub.uni-bielefeld.de/record/2936331
Cite
Mastrantuono E, Burigo M, Rodriguez-Ortiz IR, Saldana D. The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing. JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH. 2019;62(6):1625-1656.
Mastrantuono, E., Burigo, M., Rodriguez-Ortiz, I. R., & Saldana, D. (2019). The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing. JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH, 62(6), 1625-1656. doi:10.1044/2019_JSLHR-S-17-0433
Mastrantuono, Eliana, Burigo, Michele, Rodriguez-Ortiz, Isabel R., and Saldana, David. 2019. “The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing”. JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH 62 (6): 1625-1656.
Mastrantuono, E., Burigo, M., Rodriguez-Ortiz, I. R., and Saldana, D. (2019). The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing. JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH 62, 1625-1656.
Mastrantuono, E., et al., 2019. The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing. JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH, 62(6), pp. 1625-1656.
E. Mastrantuono, et al., “The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing”, JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH, vol. 62, 2019, pp. 1625-1656.
Mastrantuono, E., Burigo, M., Rodriguez-Ortiz, I.R., Saldana, D.: The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing. JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH. 62, 1625-1656 (2019).
Mastrantuono, Eliana, Burigo, Michele, Rodriguez-Ortiz, Isabel R., and Saldana, David. “The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing”. JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH 62.6 (2019): 1625-1656.
Sources
PMID: 31095442
PubMed | Europe PMC