Similar Literature
20 similar documents found (search time: 15 ms)
1.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for the basic emotions of happiness, anger, fear, sadness, surprise, and disgust: thirty pictures (five per emotion) were displayed to 96 participants, and recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information either congruent or incongruent with a facial expression was displayed before each picture of a facial expression. Congruent information improved facial expression recognition, whereas incongruent information impaired it.

2.
In a sample of 325 college students, we examined how context influences judgments of facial expressions of emotion, using a newly developed facial affect recognition task in which emotional faces are superimposed upon emotional and neutral contexts. This research used a larger sample size than previous studies, included more emotions, varied the intensity level of the expressed emotion to avoid potential ceiling effects from very easy recognition, did not explicitly direct attention to the context, and aimed to understand how recognition is influenced by non-facial information, both situationally relevant and situationally irrelevant. Both accuracy and response time (RT) varied as a function of context. For all facial expressions of emotion other than happiness, accuracy increased when the emotion of the face and context matched, and decreased when they mismatched. For all emotions, participants responded faster when the emotion of the face and context image matched and slower when they mismatched. Results suggest that the judgment of the facial expression is itself influenced by the contextual information, rather than the two being judged independently and then combined. Additionally, the results have implications for developing models of facial affect recognition and indicate that factors other than the face can influence facial affect recognition judgments.

3.
Recognition of facial affect in Borderline Personality Disorder   (total citations: 1; self-citations: 0; citations by others: 1)
Patients with Borderline Personality Disorder (BPD) have been described as emotionally hyperresponsive, especially to anger and fear in social contexts. The aim was to investigate whether BPD patients are more sensitive but less accurate in terms of basic emotion recognition, and show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions. Twenty-five women with BPD were compared with healthy controls on two different facial emotion recognition tasks. The first task allowed the assessment of the subjective detection threshold as well as the number of evaluation errors on six basic emotions. The second task assessed a response bias to blends of basic emotions. BPD patients showed no general deficit on the affect recognition task, but did show enhanced learning over the course of the experiment. For ambiguous emotional stimuli, we found a bias towards the perception of anger in the BPD patients but not towards fear. BPD patients are accurate in perceiving facial emotions, and are probably more sensitive to familiar facial expressions. They show a bias towards perceiving anger when socio-affective cues are ambiguous. Interpersonal training should focus on the differentiation of ambiguous emotion in order to reduce a biased appraisal of others.

4.
Recognizing emotion in faces: developmental effects of child abuse and neglect   (total citations: 12; self-citations: 0; citations by others: 12)
The relative contributions of (a) experience and learning and (b) internal predispositions to the recognition of emotional signals are difficult to investigate because children are virtually always exposed to complex emotional experiences from birth. The recognition of emotion among physically abused and physically neglected preschoolers was assessed in order to examine the effects of atypical experience on emotional development. In Experiment 1, children matched a facial expression to an emotional situation. Neglected children had more difficulty discriminating emotional expressions than did control or physically abused children. Physically abused children displayed a response bias for angry facial expressions. In Experiment 2, children rated the similarity of facial expressions. Control children viewed discrete emotions as dissimilar, neglected children saw fewer distinctions between emotions, and physically abused children showed the most variance across emotions. These results suggest that to the extent that children's experience with the world varies, so too will their interpretation and understanding of emotional signals.

5.
This study explored, in a sample of Chinese participants, how both the observer and the sender influence the recognition accuracy of various facial expressions. A facial manipulation task was used to examine whether an observer's facial feedback modulates the recognition of facial expressions, and the effect of the sender's mouth being open or closed on recognition accuracy was also investigated. The results showed that only the recognition accuracy of sad facial expressions was influenced simultaneously by both sources, sender and observer. For happy and neutral faces, the unidirectional cue of the sender's facial feature (i.e., mouth openness) influenced recognition accuracy, whereas the observer's bodily state did not. These findings indicate that the bidirectional influence of observer and sender on facial expression recognition accuracy differs for emotional and neutral expressions.

6.
胡治国  刘宏艳 《心理科学》2015,(5):1087-1094
Accurately recognizing facial expressions is essential for successful social interaction, and facial expression recognition is influenced by emotional context. This review first describes the facilitating effects of emotional context on facial expression recognition, mainly the within-modality (visual) emotional congruency effect and the cross-modal emotional integration effect. It then describes the impairing effects of emotional context, mainly the emotional conflict effect and the semantic interference effect. Next, it covers the influence of emotional context on the recognition of neutral and ambiguous faces, mainly the context-induced emotion effect and the subliminal affective priming effect. Finally, it summarizes and analyzes the existing research and offers suggestions for future work.

7.
The ability to interpret emotions in facial expressions is crucial for social functioning across the lifespan. Facial expression recognition develops rapidly during infancy and improves with age during the preschool years. However, the developmental trajectory from late childhood to adulthood is less clear. We tested older children, adolescents and adults on a two-alternative forced-choice discrimination task using morphed faces that varied in emotional content. Actors appeared to pose expressions that changed incrementally along three progressions: neutral-to-fear, neutral-to-anger, and fear-to-anger. Across all three morph types, adults displayed more sensitivity to subtle changes in emotional expression than children and adolescents. Fear morphs and fear-to-anger blends showed a linear developmental trajectory, whereas anger morphs showed a quadratic trend, increasing sharply from adolescents to adults. The results provide evidence for late developmental changes in emotional expression recognition with some specificity in the time course for distinct emotions.
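The abstract does not state how the morph continua were constructed; as a rough illustration only, a minimal sketch of the incremental-blending idea is given below, assuming pre-aligned face arrays and a hypothetical morph_continuum helper (real morphing software also warps facial shape, not just pixel intensities).

import numpy as np

def morph_continuum(face_a, face_b, n_steps=10):
    # Blend two aligned faces (pixel or landmark arrays of identical shape)
    # in equal increments, from 100% face_a to 100% face_b.
    weights = np.linspace(0.0, 1.0, n_steps)
    return [(1.0 - w) * face_a + w * face_b for w in weights]

# Hypothetical usage: a 10-step neutral-to-fear progression for one actor,
# with random arrays standing in for aligned face images.
neutral_face = np.random.rand(128, 128)
fear_face = np.random.rand(128, 128)
continuum = morph_continuum(neutral_face, fear_face, n_steps=10)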

8.
Research on emotion recognition has been dominated by studies of photographs of facial expressions. A full understanding of emotion perception and its neural substrate will require investigations that employ dynamic displays and means of expression other than the face. Our aims were: (i) to develop a set of dynamic and static whole-body expressions of basic emotions for systematic investigations of clinical populations, and for use in functional-imaging studies; (ii) to assess forced-choice emotion-classification performance with these stimuli relative to the results of previous studies; and (iii) to test the hypotheses that more exaggerated whole-body movements would produce (a) more accurate emotion classification and (b) higher ratings of emotional intensity. Ten actors portrayed 5 emotions (anger, disgust, fear, happiness, and sadness) at 3 levels of exaggeration, with their faces covered. Two identical sets of 150 emotion portrayals (full-light and point-light) were created from the same digital footage, along with corresponding static images of the 'peak' of each emotion portrayal. Recognition tasks confirmed previous findings that basic emotions are readily identifiable from body movements, even when static form information is minimised by use of point-light displays, and that full-light and even point-light displays can convey identifiable emotions, though rather less efficiently than dynamic displays. Recognition success differed for individual emotions, corroborating earlier results about the importance of distinguishing differences in movement characteristics for different emotional expressions. The patterns of misclassifications were in keeping with earlier findings on emotional clustering. Exaggeration of body movement (a) enhanced recognition accuracy, especially for the dynamic point-light displays, but notably not for sadness, and (b) produced higher emotional-intensity ratings, regardless of lighting condition, for movies but to a lesser extent for stills, indicating that intensity judgments of body gestures rely more on movement (or form-from-movement) than static form information.

9.
The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence to the recognition of emotional facial expressions for both same and different valences.

10.
The body and the face are sensitive cues for recognizing others' emotions. As in the early visual processing of facial expressions, the P1 component is more sensitive to negative body expressions such as fear and anger, reflecting fast and unconscious processing of bodily threat information. Emotional bodies and faces also undergo similar configural processing, as both elicit comparable N170 components over occipito-temporal visual cortex, although the neural substrates involved are not entirely the same. In configural encoding, the N170 and the vertex positive potential (VPP) are more pronounced for facial expressions than for body expressions. At later stages of processing facial and body expressions, the early posterior negativity (EPN) reflects attention-directed processing of the visual encoding of faces and bodies, and the subsequent P3 and late positive component (LPC) reflect higher-level cognitive processing of complex emotional information in parieto-frontal cortex. Body expressions additionally elicit an N190 component associated with the extrastriate body area, which is sensitive to both the emotional and the action information conveyed by the body. Future research should further examine the influence of action on emotion perception and the mechanisms underlying the processing of dynamic face-body emotion.

11.
Attentional capture refers to the phenomenon whereby task-irrelevant stimuli involuntarily attract attention. Experiment 1 used a visual search task to examine the extent to which emotional faces irrelevant to the main task capture attention and the mechanism involved; Experiment 2 further explored how temporal task demands affect attentional capture by irrelevant emotional faces. The results showed that angry faces captured more attention than faces with other expressions, and that this capture was influenced by holistic emotional processing. Temporal task demands affected the attentional selection of target stimuli, but the anger-superiority effect was unaffected by temporal task demands and may therefore reflect a relatively automatic process.

12.
白鹭  毛伟宾  王蕊  张文海 《心理学报》2017,(9):1172-1183
Using disgust and fear facial expressions, two negative emotions with low perceptual similarity to each other, as stimuli, and providing five emotional language labels to reduce the facilitating effect of verbal context on face recognition, this study conducted two experiments on how natural scenes and body actions affect facial expression recognition. The aims were to examine how emotional congruence between a facial expression and a natural scene affects the recognition of emotional faces and the processing of the scene, and how adding body actions whose emotion conflicts with that of the scene might affect facial expression recognition. The results showed that: (1) even with an increased number of emotional label options, the emotion of the natural scene still significantly affected facial expression recognition; (2) when the emotions of the face and the scene were incongruent, face recognition relied more on processing the scene, so the scene was processed more deeply; and (3) body actions interfered to some extent with the influence of the scene on facial expression recognition, but the natural scene still played an important role in recognizing emotional facial expressions.

13.
We examined how the recognition of facial emotion was influenced by manipulation of both spatial and temporal properties of 3-D point-light displays of facial motion. We started with the measurement of the 3-D positions of multiple locations on the face during posed expressions of anger, happiness, sadness, and surprise, and then manipulated the spatial and temporal properties of the measurements to obtain new versions of the movements. In two experiments, we examined recognition of these original and modified facial expressions: in experiment 1, we manipulated the spatial properties of the facial movement, and in experiment 2 we manipulated the temporal properties. The results of experiment 1 showed that exaggeration of facial expressions relative to a fixed neutral expression resulted in enhanced ratings of the intensity of that emotion. The results of experiment 2 showed that changing the duration of an expression had a small effect on ratings of emotional intensity, with a trend for expressions with shorter durations to have lower ratings of intensity. The results are discussed within the context of theories of encoding as related to caricature and emotion.
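The abstract does not give the exact formulas used for these manipulations; a minimal sketch of one common approach is shown below, treating spatial exaggeration as scaling each point's displacement from the neutral expression and the temporal manipulation as resampling the motion to a new duration. The function names, the gain value, and the array layout are assumptions for illustration only.

import numpy as np

def exaggerate_motion(trajectory, neutral, gain=1.5):
    # trajectory: (frames, points, 3) measured 3-D positions over time.
    # neutral: (points, 3) positions of the same points in a neutral expression.
    # gain > 1 exaggerates the expression; gain < 1 attenuates it.
    return neutral + gain * (trajectory - neutral)

def rescale_duration(trajectory, factor):
    # Resample the movement in time so it lasts `factor` times the original,
    # using linear interpolation between measured frames.
    n_in = trajectory.shape[0]
    n_out = max(2, int(round(n_in * factor)))
    t_in = np.linspace(0.0, 1.0, n_in)
    t_out = np.linspace(0.0, 1.0, n_out)
    flat = trajectory.reshape(n_in, -1)
    cols = [np.interp(t_out, t_in, flat[:, j]) for j in range(flat.shape[1])]
    return np.stack(cols, axis=1).reshape(n_out, *trajectory.shape[1:])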

14.
The body and the face are important cues for expressing and recognizing emotion. Compared with facial expressions, the processing of body expressions is distinguished by its role in compensating for emotional information, conveying perceptual information about movement and action, and generating adaptive behavior. The neural substrates of emotional body and face processing may be adjacent or partially overlapping, yet they are also dissociable; the EBA, FBA, SPL, and IPL are brain regions specifically associated with body expression processing. Future work should systematically investigate the neural bases underlying the processing of facial, bodily, and vocal emotion cues, explore cross-cultural differences in bodily emotion processing, and examine the characteristics of body expression processing in patients with emotional disorders.

15.
Four experiments investigated priming of emotion recognition using a range of emotional stimuli, including facial expressions, words, pictures, and nonverbal sounds. In each experiment, a prime-target paradigm was used with related, neutral, and unrelated pairs. In Experiment 1, facial expression primes preceded word targets in an emotion classification task. A pattern of priming of emotional word targets by related primes with no inhibition of unrelated primes was found. Experiment 2 reversed these primes and targets and found the same pattern of results, demonstrating bidirectional priming between facial expressions and words. Experiment 2 also found priming of facial expression targets by picture primes. Experiment 3 demonstrated that priming occurs not just between pairs of stimuli that have a high co-occurrence in the environment (for example, nonverbal sounds and facial expressions), but with stimuli that co-occur less frequently and are linked mainly by their emotional category (for example, nonverbal sounds and printed words). This shows the importance of the prime and target sharing a common emotional category, rather than their previous co-occurrence. Experiment 4 extended the findings by showing that there are category-based effects as well as valence effects in emotional priming, supporting a categorical view of emotion recognition.

16.
Very few large-scale studies have focused on emotional facial expression recognition (FER) in 3-year-olds, an age of rapid social and language development. We studied FER in 808 healthy 3-year-olds using verbal and nonverbal computerized tasks for four basic emotions (happiness, sadness, anger, and fear). Three-year-olds showed differential performance on the verbal and nonverbal FER tasks, especially for fear: fear was among the most accurately recognized facial expressions when matched nonverbally and the least accurately recognized when labeled verbally. Sex influenced neither emotion-matching nor emotion-labeling performance after adjusting for basic matching or labeling ability. Three-year-olds made systematic errors in emotion labeling: happy expressions were often confused with fearful expressions, whereas negative expressions were often confused with other negative expressions. Together, these findings suggest that 3-year-olds' FER skills strongly depend on task specifications, and fear was the most sensitive facial expression in this regard. Finally, in line with previous studies, we found that recognized emotion categories are initially broad, including emotions of the same valence, as reflected in the nonrandom errors of 3-year-olds.

17.
Chronic schizophrenics are known to manifest a deficit of categorisation and recognition of primary emotional facial expression despite intact recognition of face identity. An equivalent deficit of expression of the same primary facial emotions in schizophrenics has not been clearly established. Twenty chronic hospitalised schizophrenics and 20 normals were therefore tested on tasks of facial emotional expression upon verbal command, of facial emotional expression imitation, and of non-affective bucco-facial praxic imitation. Results indicate that chronic schizophrenics do manifest a deficit of facial emotion expression which can best be explained by task parameters, such as verbal cueing of emotions, perceptual recognition, and bucco-facial dyspraxia, in decreasing order of importance. The deficit does not appear to result from neuroleptic or anticholinergic medication, nor from length of hospitalisation or duration of illness.

18.
Three experiments examined 3- and 5-year-olds’ recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression remained neutral (Experiment 1) or varied between immediate and delayed tests: from neutral to smile and anger (Experiment 2), from smile to neutral and anger (Experiment 3, condition 1), or from anger to neutral and smile (Experiment 3, condition 2). In all experiments, immediate face recognition was not influenced by emotional expression for either age group. Delayed face recognition was most accurate for faces in identical emotional expression. For 5-year-olds, delayed face recognition (with varied emotional expression) was not influenced by which emotional expression had been displayed during the immediate recognition test. Among 3-year-olds, accuracy decreased when facial expressions varied from neutral to smile and anger but was constant when facial expressions varied from anger or smile to neutral, smile or anger. Three-year-olds’ recognition was facilitated when faces initially displayed smile or anger expressions, but this was not the case for 5-year-olds. Results thus indicate a developmental progression in face identity recognition with varied emotional expressions between ages 3 and 5.

19.
吴彬星  张智君  孙雨生 《心理学报》2015,47(10):1201-1212
Current theories of the relationship between facial sex and facial expression remain incomplete, while considerable evidence indicates that face familiarity is closely related to the processing of both facial sex and expression. Using the Garner paradigm, this study examined the relationship between facial sex and expression at different levels of face familiarity across four experiments. In Experiment 1, face identities were unfamiliar and not repeated: each stimulus appeared only once in the control and orthogonal blocks of the Garner paradigm, so face familiarity was low. Experiment 2 was identical except that face identities were repeated, yielding moderate familiarity. In Experiment 3, face identities were unfamiliar, but the stimuli were presented repeatedly within the control and orthogonal blocks, yielding high familiarity. Experiment 4 increased familiarity through face learning in order to directly test how increased familiarity affects the relationship between facial sex and expression. The results showed that for unfamiliar faces, expression influenced the processing of facial sex unidirectionally; as face familiarity increased, the influence between facial sex and expression became bidirectional. Face familiarity therefore moderates the mutual influence of facial sex and expression.

20.
Theoretical accounts suggest an increased and automatic neural processing of emotional, especially threat-related, facial expressions and emotional prosody. In line with this assumption, several functional imaging studies showed activation to threat-related faces and voices in subcortical and cortical brain areas during attentional distraction or unconscious stimulus processing. Furthermore, electrophysiological studies provided evidence for automatic early brain responses to emotional facial expressions and emotional prosody. However, there is increasing evidence that available cognitive resources modulate brain responses to emotional signals from faces and voices, even though conflicting findings may occur depending on contextual factors, specific emotions, sensory modality, and neuroscientific methods used. The current review summarizes these findings and suggests that further studies should combine information from different sensory modalities and neuroscientific methods such as functional neuroimaging and electrophysiology. Furthermore, it is concluded that the variable saliency and relevance of emotional social signals on the one hand and available cognitive resources on the other hand interact in a dynamic manner, making absolute boundaries of the automatic processing of emotional information from faces and voices unlikely.
