Audiovisual speech from emotionally expressive and lateralized faces

Authors: Gordon, Michael S.; Hibberts, Mary

Affiliation: University of South Alabama, Mobile, AL, USA. gordonm10@wpunj.edu
Abstract: Emotional expression, and how it is lateralized across the two sides of the face, may influence how we detect audiovisual speech. To investigate how these components interact, we conducted experiments comparing the perception of sentences expressed with happy, sad, and neutral emotions. In addition, we isolated the facial asymmetries for affective and speech processing by independently testing the two sides of a talker's face. These asymmetrical differences were exaggerated using dynamic facial chimeras in which the left or right half of the face was paired with its mirror image during speech production. Results suggest that there are facial asymmetries in audiovisual speech: the right side of the face and right-facial chimeras supported better speech perception than their left-face counterparts. Affective information was also critical, in that happy expressions tended to improve speech performance on both sides of the face relative to all other emotions, whereas sad expressions generally inhibited visual speech information, particularly from the left side of the face. The results suggest that approach information may facilitate visual and auditory speech detection.

Keywords: