Mapping the emotional face. How individual face parts contribute to successful emotion recognition

Wegrzyn M, Vogt M, Kireclioglu B, Schneider J, Kißler J (2017)
PLOS ONE 12(5): e0177239.

Journal Article | Original Article | Published | English
Abstract
Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features observers rely on most when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and to assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (reliance on the mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the Facial Action Coding System. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eye or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.
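As a rough illustration of the paradigm described above, the sketch below scores each of the 48 face tiles by how strongly its visibility is associated with correct recognition. The trial format and the exact scoring rule (difference between the tile's visibility rate on correct versus incorrect trials) are illustrative assumptions, not the authors' published method.

```python
# Illustrative sketch only: the scoring rule is an assumption, not the
# paper's exact contribution measure. Each trial records which tiles had
# been uncovered when the observer stopped, and whether the label was
# correct.
def tile_contributions(trials, n_tiles=48):
    """Score each tile as P(visible | correct) - P(visible | incorrect).
    Positive scores mark tiles whose visibility goes with success."""
    correct_counts = [0] * n_tiles
    incorrect_counts = [0] * n_tiles
    n_correct = sum(1 for t in trials if t["correct"])
    n_incorrect = len(trials) - n_correct
    for t in trials:
        for tile in t["visible_tiles"]:
            if t["correct"]:
                correct_counts[tile] += 1
            else:
                incorrect_counts[tile] += 1
    return [
        (correct_counts[i] / n_correct if n_correct else 0.0)
        - (incorrect_counts[i] / n_incorrect if n_incorrect else 0.0)
        for i in range(n_tiles)
    ]

# Toy data: tile 0 (say, an eye-region tile) is visible only on correct
# trials, so it receives the highest score.
trials = [
    {"visible_tiles": [0, 5], "correct": True},
    {"visible_tiles": [0, 7], "correct": True},
    {"visible_tiles": [5, 7], "correct": False},
]
scores = tile_contributions(trials)  # scores[0] = 1.0
```

Mapping such scores back onto the tile grid yields the kind of importance map per expression that the abstract describes.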
Financial disclosure
Article Processing Charge funded by the Deutsche Forschungsgemeinschaft and the Open Access Publication Fund of Bielefeld University.
Cite this

Wegrzyn, M., Vogt, M., Kireclioglu, B., Schneider, J., & Kißler, J. (2017). Mapping the emotional face. How individual face parts contribute to successful emotion recognition. PLOS ONE, 12(5), e0177239. doi:10.1371/journal.pone.0177239
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Main File(s)
Access Level: OA Open Access
Last Uploaded: 2017-11-27T08:41:28Z


2 Citations in Europe PMC

Data provided by Europe PubMed Central.

Facial expression analysis with AFFDEX and FACET: A validation study.
Stöckli S, Schulte-Mecklenbeck M, Borer S, Samson AC. Behav Res Methods, 2017. PMID: 29218587

Sources

PMID: 28493921
PubMed | Europe PMC
