MEG activity in visual and auditory cortices represents acoustic speech-related information during silent lip reading

Bröhl F, Keitel A, Kayser C (2022)
eNeuro: ENEURO.0209-22.2022.

Journal article | Published | English
 
Download
No files have been uploaded; publication record only.
Authors
Abstract / Notes
Speech is an intrinsically multisensory signal and seeing the speaker's lips forms a cornerstone of communication in acoustically impoverished environments. Still, it remains unclear how the brain exploits visual speech for comprehension. Previous work debated whether lip signals are mainly processed along the auditory pathways or whether the visual system directly implements speech-related processes. To probe this, we systematically characterized dynamic representations of multiple acoustic and visual speech-derived features in source localized MEG recordings that were obtained while participants listened to speech or viewed silent speech. Using a mutual-information framework we provide a comprehensive assessment of how well temporal and occipital cortices reflect the physically presented signals and unique aspects of acoustic features that were physically absent but may be critical for comprehension. Our results demonstrate that both cortices feature a functionally specific form of multisensory restoration: during lip reading they reflect unheard acoustic features, independent of co-existing representations of the visible lip movements. This restoration emphasizes the unheard pitch signature in occipital cortex and the speech envelope in temporal cortex and is predictive of lip reading performance. These findings suggest that when seeing the speaker's lips, the brain engages both visual and auditory pathways to support comprehension by exploiting multisensory correspondences between lip movements and spectro-temporal acoustic cues.
Year published
2022
Journal title
eNeuro
Article number
ENEURO.0209-22.2022
eISSN
2373-2822
Page URI
https://pub.uni-bielefeld.de/record/2963999

Cite

Bröhl, F., Keitel, A., & Kayser, C. (2022). MEG activity in visual and auditory cortices represents acoustic speech-related information during silent lip reading. eNeuro, ENEURO.0209-22.2022. https://doi.org/10.1523/ENEURO.0209-22.2022
Material in PUB:
Dissertation containing this PUB record
Sources

PMID: 35728955
PubMed | Europe PMC
