Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence

Maquate K, Kißler J, Knoeferle P (2022)
Language, Cognition and Neuroscience.

Journal article | Published online ahead of print | English
 
Download
No files have been uploaded. Bibliographic record only.
Author(s)
Maquate, Katja; Kißler, Johanna (UniBi); Knoeferle, Pia
Abstract / Note
We investigated the brain responses associated with integrating a speaker's facial emotion into situations in which the speaker verbally describes an emotional event. In two EEG experiments, young adult participants were primed with a happy or a sad speaker face. The target consisted of an emotionally positive or negative IAPS photo accompanied by a spoken emotional sentence describing that photo. The speaker's face either matched or mismatched the valence of the event-sentence. ERPs elicited by the adverb conveying sentence valence showed significantly larger negative mean amplitudes in the EPN time window, and descriptively in the N400 time window, for positive speaker faces paired with negative event-sentences (vs. matching negative prime-target trials). Our results suggest that young adults might allocate more processing resources to attending to and processing negative (vs. positive) emotional situations when primed with a positive (vs. negative) speaker face, but not vice versa. A post-hoc analysis indicated that this interaction was driven by the female participants. We extend previous eye-tracking findings with insights into the timing of the functional brain correlates implicated in integrating the valence of a speaker face into a multi-modal emotional situation.
Keywords
EPN; N400; emotional face priming; spoken language processing; multi-modal language processing
Publication year
2022
Journal title
Language, Cognition and Neuroscience
ISSN
2327-3798
eISSN
2327-3801
Page URI
https://pub.uni-bielefeld.de/record/2965784

Cite

Maquate K, Kißler J, Knoeferle P. Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence. Language, Cognition and Neuroscience. 2022.
Maquate, K., Kißler, J., & Knoeferle, P. (2022). Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence. Language, Cognition and Neuroscience. https://doi.org/10.1080/23273798.2022.2108089
Maquate, Katja, Kißler, Johanna, and Knoeferle, Pia. 2022. “Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence”. Language, Cognition and Neuroscience.
Maquate, K., Kißler, J., and Knoeferle, P. (2022). Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence. Language, Cognition and Neuroscience.
Maquate, K., Kißler, J., & Knoeferle, P., 2022. Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence. Language, Cognition and Neuroscience.
K. Maquate, J. Kißler, and P. Knoeferle, “Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence”, Language, Cognition and Neuroscience, 2022.
Maquate, K., Kißler, J., Knoeferle, P.: Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence. Language, Cognition and Neuroscience. (2022).
Maquate, Katja, Kißler, Johanna, and Knoeferle, Pia. “Speakers' emotional facial expressions modulate subsequent multi-modal language processing: ERP evidence”. Language, Cognition and Neuroscience (2022).