Similar literature
 20 similar documents found (search time: 0 ms)
1.
Rachael E. Jack. Visual Cognition, 2013, 21(9-10): 1248-1286
With over a century of theoretical development and empirical investigation across broad fields (e.g., anthropology, psychology, evolutionary biology), the universality of facial expressions of emotion remains a central debate in psychology. How near or far, then, is this debate from being resolved? Here, I will address this question by highlighting and synthesizing the significant advances in the field that have elevated knowledge of facial expression recognition across cultures. Specifically, I will discuss the impact of early major theoretical and empirical contributions in parallel fields and their later integration in modern research. With illustrative examples, I will show that the debate on the universality of facial expressions has arrived at a new juncture and faces a new generation of exciting questions.

2.
Shame and guilt are closely related self-conscious emotions of negative affect that give rise to divergent self-regulatory and motivational behaviours. While guilt-proneness has demonstrated positive relationships with self-report measures of empathy and adaptive interpersonal functioning, shame-proneness tends to be unrelated or inversely related to empathy and is associated with interpersonal difficulties. At present, no research has examined relationships between shame and guilt-proneness with facial emotion recognition ability. Participants (N = 363) completed measures of shame and guilt-proneness along with a facial emotion recognition task which assessed the ability to identify displays of anger, sadness, happiness, fear, disgust, and shame. Guilt-proneness was consistently positively associated with facial emotion recognition ability. In contrast, shame-proneness was unrelated to capacity for facial emotion recognition. Findings provide support for theory arguing that guilt and empathy operate synergistically and may also help explain the inverse relationship between guilt-proneness and propensity for aggressive behaviour.

3.
In social dilemmas, verbal communication of one's intentions is an important factor in increasing cooperation. In addition to verbal communication of one's intentions, the communication of emotions such as anger and happiness can also influence cooperative behavior. In the present paper, we argue that facial expressions of emotion moderate verbal communication in social dilemmas. More specifically, three experiments showed that if the other person displayed happiness, he or she was perceived as honest, trustworthy, and reliable, and cooperation increased when verbal communication was cooperative rather than self-interested. However, if the other person displayed anger, verbal communication did not influence people's decision behavior. Results also showed interactive effects on people's perceptions of trustworthiness, which partially mediated decision behavior. These findings suggest that emotion displays have an important function in organizational settings because they can influence social interactions and cooperative behavior. Copyright © 2009 John Wiley & Sons, Ltd.

4.
5.
The present study examined whether an information-processing bias against emotional facial expressions is present among individuals with social anxiety. College students with high (high social anxiety group; n = 26) and low social anxiety (low social anxiety group; n = 26) performed three different types of working memory tasks: (a) ordering positive and negative facial expressions according to the intensity of emotion; (b) ordering pictures of faces according to age; and (c) ordering geometric shapes according to size. The high social anxiety group performed significantly more poorly than the low social anxiety group on the facial expression task, but not on the other two tasks with nonemotional stimuli. These results suggest that high social anxiety interferes with the processing of emotionally charged facial expressions.

6.
How to reveal the cognitive and neural mechanisms of emotional face processing has long been a central topic in psychology and social neuroscience. Previous research has mainly used individual facial expressions to induce or present emotion, while group emotion perception and experience have received very little attention; group facial expressions, as the principal way in which group emotions are expressed, urgently deserve closer study. This project will therefore use group facial expressions (facial crowds) as group emotion stimuli and combine event-related potentials (ERP), functional magnetic resonance imaging (fMRI), and transcranial magnetic stimulation (TMS) with behavioral experiments to characterize the temporal dynamics and brain activation patterns of group facial expression processing with respect to emotional information (valence and intensity), orientation (frontal, profile, inverted), completeness (partial versus whole presentation), and spatial frequency (full-spectrum, high, low). This work should yield a comprehensive account of the general principles of group emotion recognition and has practical implications for better optimizing social interaction.

7.
张凯莉, 张琴, 周静, 王沛. 《心理科学进展》 (Advances in Psychological Science), 2017, (11): 1955-1963
Perceivers typically process an unfamiliar face according to the multiple social category cues it carries, such as gender, age, and race, so as to identify and understand the person quickly. In multiple social categorization based on face recognition, the subcategories interact in complex ways. Using methods such as the "Who Said What" paradigm, the repetition priming paradigm, the Garner selective attention paradigm, and the mouse-tracking paradigm, researchers have found that implicit processing of subcategories is mutually attenuating, whereas explicit processing shows asymmetric and biased interactions. Dynamic interactive theory provides a further theoretical analysis and interpretation of these findings. Future research should distinguish the stages of social category processing more rigorously, highlighting both the differences and the links between implicit and explicit processing, and should integrate the various paradigms to overcome the divergent or even contradictory results produced by methodological heterogeneity.

8.
9.
People who explain why ambiguous faces are expressing anger perceive and remember those faces as angrier than do people who explain why the same faces are expressing sadness. This phenomenon may be explained by a two-stage process in which language decomposes a facial configuration into its component features, which are then reintegrated with emotion categories available in the emotion explanation itself. This configural-decomposition hypothesis is consistent with experimental results showing that the explanation effect is attenuated when configural face processing is impaired (e.g., when the faces are inverted). Ironically, although people explain emotional expressions to make more accurate attributions, the process of explanation itself can decrease accuracy by leading to perceptual assimilation of the expressions to the emotions being explained.

10.
In the present study we examined the neural correlates of facial emotion processing in the first year of life using ERP measures and cortical source analysis. EEG data were collected cross-sectionally from 5- (N = 49), 7- (N = 50), and 12-month-old (N = 51) infants while they viewed images of angry, fearful, and happy faces. The N290 component was larger in amplitude in response to fearful and happy faces than to angry faces in all posterior clusters and showed its largest response to fear, relative to the other two emotions, only over the right occipital area. The P400 and Nc components were larger in amplitude in response to angry than to happy and fearful faces over central and frontal scalp. Cortical source analysis of the N290 component revealed greater cortical activation in the right fusiform face area in response to fearful faces. This effect started to emerge at 5 months and became well established at 7 months, but it disappeared at 12 months. The P400 and Nc components were primarily localized to the PCC/Precuneus, where heightened responses to angry faces were observed. The current results suggest that the detection of a fearful face in the infant brain can occur shortly (~200–290 ms) after stimulus onset, and that this process may rely on the face network and develop substantially between 5 and 7 months of age. The current findings also suggest that the differential processing of angry faces occurs later, in the P400/Nc time window, which recruits the PCC/Precuneus and is associated with the allocation of infants' attention.
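For readers unfamiliar with how components such as the N290 are quantified, the sketch below shows one common approach: averaging EEG amplitude over an a-priori time window (here the ~200–290 ms window mentioned in the abstract) at a posterior channel cluster. The sampling rate, channel indices, and data are all hypothetical; this is not the authors' analysis pipeline, just a minimal illustration of mean-amplitude scoring.

```python
# Minimal sketch of mean-amplitude ERP scoring (hypothetical data and names).
import numpy as np

sfreq = 250.0                              # assumed sampling rate in Hz
times = np.arange(-0.2, 0.8, 1 / sfreq)    # epoch from -200 to 800 ms
n_trials, n_channels = 60, 32
rng = np.random.default_rng(0)
epochs = rng.normal(size=(n_trials, n_channels, times.size))  # fake EEG, µV

# N290 window from the abstract: ~200-290 ms after stimulus onset.
win = (times >= 0.200) & (times <= 0.290)
posterior = [28, 29, 30, 31]               # hypothetical posterior cluster

# Mean amplitude per trial over the window and channel cluster;
# condition averages of these values would then be compared statistically.
n290 = epochs[:, posterior, :][:, :, win].mean(axis=(1, 2))
print("mean N290 amplitude (µV):", n290.mean())
```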

11.
The human face conveys important social signals when people interact in social contexts. The current study investigated the relationship between face recognition and emotional intelligence, and how the societal factors of emotion and race influence people's face recognition. Participants' recognition accuracy, reaction time, sensitivity, and response bias were measured to examine their face-processing ability. Fifty Caucasian undergraduates (38 females, 12 males; average age = 21.76 years) participated in a face recognition task in which they discriminated previously presented target faces from novel distractor faces. A positive correlation between participants' emotional intelligence scores and their performance on the face recognition task was observed, suggesting that face recognition ability is associated with emotional or social intelligence. Additionally, Caucasian participants recognized happy faces better than angry or neutral faces. It was also observed that people recognized Asian faces better than Caucasian ones, which appears to contradict the classic other-race effect. The present study suggests that some societal factors can influence face processing, and that face recognition ability may in turn predict social intelligence.
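The abstract reports "sensitivity" and "response bias" without naming an estimator; a common choice in recognition tasks like this is the signal detection theory indices d′ and c. The sketch below assumes those definitions, with made-up trial counts and a log-linear correction; it is illustrative only, not the study's actual analysis.

```python
# Minimal sketch of signal detection indices, assuming d' and c are meant.
from scipy.stats import norm

hits, misses = 42, 8             # target faces correctly / incorrectly judged
false_alarms, corr_rej = 12, 38  # distractor faces incorrectly / correctly judged

# Log-linear correction avoids infinite z-scores at rates of 0 or 1.
hit_rate = (hits + 0.5) / (hits + misses + 1)
fa_rate = (false_alarms + 0.5) / (false_alarms + corr_rej + 1)

d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)              # sensitivity
criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))   # response bias
print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
```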

12.
13.
While the recognition of emotional expressions has been extensively studied, the behavioural response to these expressions has not. In the interpersonal circumplex, behaviour is defined in terms of communion and agency. In this study, we examined behavioural responses to both facial and postural expressions of emotion. We presented 101 Romanian students with facial and postural stimuli involving individuals ('targets') expressing happiness, sadness, anger, or fear. Using an interpersonal grid, participants simultaneously indicated how communal (i.e., quarrelsome or agreeable) and agentic (i.e., dominant or submissive) they would be towards people displaying these expressions. Participants were agreeable-dominant towards targets showing happy facial expressions and primarily quarrelsome towards targets with angry or fearful facial expressions. Responses to targets showing sad facial expressions were neutral on both dimensions of interpersonal behaviour. Postural versus facial expressions of happiness and anger elicited similar behavioural responses. Participants responded in a quarrelsome-submissive way to fearful postural expressions and in an agreeable way to sad postural expressions. Behavioural responses to the various facial expressions were largely comparable to those previously observed in Dutch students; the observed differences may be explained by participants' cultural background. Responses to the postural expressions largely matched responses to the facial expressions.

14.
15.
People with multiple sclerosis (MS) can experience problems in interpreting others' emotions from faces or voices. However, to date little is known about whether difficulties in emotion perception in MS are related to broader aspects of social functioning, and few studies have assessed emotion perception in MS with more ecologically valid, multimodal video materials. The current study examined (1) the effect of MS on perceiving emotions from faces, voices, and multimodal videos; (2) the possible role of slowed processing and executive dysfunction in emotion perception problems in MS; and (3) the relationship between emotion perception and broader social functioning in MS. Fifty-three people with MS and 31 healthy controls completed tasks of emotion perception and cognition and rated their levels of social support and social participation. Participants with MS performed worse than demographically matched controls on all measures of emotion perception, and emotion perception performance was related to cognitive measures in those with MS. Significant associations were also found between emotion perception difficulties in MS and poorer social function. In particular, people with MS who had poorer emotion perception also reported lower levels of social support from their friends, and regression analysis showed that this prediction held even when disease severity and cognitive function were taken into account. These results show that problems with emotion perception in MS extend to more realistic tasks and may predict key aspects of social functioning.
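The covariate-adjusted prediction described at the end of the abstract can be illustrated with an ordinary least-squares regression in which disease severity and cognitive function enter as covariates. The variable names and simulated data below are hypothetical; the abstract does not give the study's actual model specification.

```python
# Minimal sketch of a regression with covariates (hypothetical variables).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 53  # size of the MS group in the study
df = pd.DataFrame({
    "emotion_perception": rng.normal(size=n),
    "disease_severity": rng.normal(size=n),
    "cognition": rng.normal(size=n),
})
# Simulated outcome: social support partly driven by emotion perception.
df["social_support"] = 0.5 * df["emotion_perception"] + rng.normal(size=n)

# Does emotion perception predict social support beyond the covariates?
model = smf.ols(
    "social_support ~ emotion_perception + disease_severity + cognition",
    data=df,
).fit()
print(model.summary())
```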

16.
A new algorithm for multidimensional scaling analysis of sorting data and hierarchical-sorting data is tested by applying it to facial expressions of emotion. We construct maps in "facial expression space" for two sets of still photographs: the I-FEEL series (expressions displayed spontaneously by infants and young children), and a subset of the Lightfoot series (posed expressions, all from one actress). The analysis avoids potential artefacts by fitting a map directly to subjects' judgments, rather than transforming the data into a matrix of estimated dissimilarities as an intermediate step. The results for both stimulus sets display an improvement in the extent to which they agree with existing maps. Some points emerge about the limitations of sorting data and the need for caution when interpreting MDS configurations derived from them.
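For contrast with the direct-fitting approach the abstract describes, the sketch below shows the conventional two-step baseline it argues against: sorting data are first converted into an estimated dissimilarity matrix (here, the proportion of subjects who placed two photographs in different piles) and then submitted to an off-the-shelf MDS solver. All data are made up, and this is not the paper's new algorithm.

```python
# Minimal sketch of the conventional two-step MDS of sorting data
# (the intermediate-dissimilarity approach the paper's algorithm avoids).
import numpy as np
from sklearn.manifold import MDS

# Each row is one subject's sort: sorts[s][i] = pile label that subject s
# assigned to photograph i (6 photos, 4 hypothetical subjects).
sorts = np.array([
    [0, 0, 1, 1, 2, 2],
    [0, 0, 0, 1, 1, 1],
    [0, 1, 1, 2, 2, 2],
    [0, 0, 1, 1, 1, 2],
])

n_subjects, n_items = sorts.shape
# Estimated dissimilarity: fraction of subjects separating items i and j.
diss = np.zeros((n_items, n_items))
for s in range(n_subjects):
    diss += (sorts[s][:, None] != sorts[s][None, :]).astype(float)
diss /= n_subjects

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(diss)  # 2-D "facial expression space" map
print(coords)
```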

17.
We examined the relationship between experienced positive/negative affect and cardiac reactivity and facial muscle movements during laboratory tasks with different demands. Heart rate, respiratory sinus arrhythmia, pre-ejection period, and facial electromyography were measured during startle, mental arithmetic, a reaction time task, and a speech task. The results revealed that individuals experiencing high levels of positive affect exhibited more pronounced parasympathetic, heart rate, and orbicularis oculi reactivity than others. Individuals who experienced high levels of negative affect during the tasks showed higher corrugator supercilii responses. Men and women showed slightly different response patterns. To conclude, cardiac reactivity may be associated with positive involvement and enthusiasm in some situations, and not all reactivity should automatically be considered potentially pathological.

18.
Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, comparing facial expressions with gender and age allowed us to test whether emotional or non-emotional content influences the configural perception of faces. In Experiment I, we found no differences between groups in the discrimination tasks, for either facial expression or age. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD; rather, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that the decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). These results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits that characterizes PD.

19.
This study investigated whether subjects high and low in public speaking fear react with different facial electromyographic (EMG) activity when exposed to negative and positive social stimuli. A high-fear and a low-fear group were selected with the help of a questionnaire and were exposed to slides of angry and happy faces while facial EMG from the corrugator and zygomatic muscle regions was measured. The subjects also rated the stimuli on different emotional dimensions. Consistent with earlier research, low-fear subjects reacted with increased corrugator activity to angry faces and increased zygomatic activity to happy faces. The high-fear group, on the other hand, did not distinguish between angry and happy faces. Rating data indicated that the high-fear group perceived angry faces as emotionally more negative. The present results are consistent with earlier studies, indicating that the facial EMG technique is sensitive enough to detect differential responding among clinically interesting groups, such as people suffering from social fears.

20.
Research suggests that infants progress from discrimination to recognition of emotions in faces during the first half year of life. It is unknown whether the perception of emotions from bodies develops in a similar manner. In the current study, when presented with happy and angry body videos and voices, 5-month-olds looked longer at the matching video when the videos were presented upright but not when they were inverted. In contrast, 3.5-month-olds failed to match even with upright videos. Thus, 5-month-olds but not 3.5-month-olds exhibited evidence of recognition of emotions from bodies by demonstrating intermodal matching. In a subsequent experiment, the younger infants did discriminate between body emotion videos but failed to exhibit an inversion effect, suggesting that their discrimination may be based on low-level stimulus features. These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months. This pattern of development is similar to the development of face emotion knowledge and suggests that both the face and body emotion perception systems develop rapidly during the first half year of life.
