Similar Literature
20 similar documents found (search time: 31 ms)
1.
Research on emotion recognition has been dominated by studies of photographs of facial expressions. A full understanding of emotion perception and its neural substrate will require investigations that employ dynamic displays and means of expression other than the face. Our aims were: (i) to develop a set of dynamic and static whole-body expressions of basic emotions for systematic investigations of clinical populations, and for use in functional-imaging studies; (ii) to assess forced-choice emotion-classification performance with these stimuli relative to the results of previous studies; and (iii) to test the hypotheses that more exaggerated whole-body movements would produce (a) more accurate emotion classification and (b) higher ratings of emotional intensity. Ten actors portrayed 5 emotions (anger, disgust, fear, happiness, and sadness) at 3 levels of exaggeration, with their faces covered. Two identical sets of 150 emotion portrayals (full-light and point-light) were created from the same digital footage, along with corresponding static images of the 'peak' of each emotion portrayal. Recognition tasks confirmed previous findings that basic emotions are readily identifiable from body movements, even when static form information is minimised by use of point-light displays, and that static full-light and even static point-light displays can convey identifiable emotions, though rather less efficiently than dynamic displays. Recognition success differed for individual emotions, corroborating earlier results about the importance of distinguishing differences in movement characteristics for different emotional expressions. The patterns of misclassifications were in keeping with earlier findings on emotional clustering. Exaggeration of body movement (a) enhanced recognition accuracy, especially for the dynamic point-light displays, but notably not for sadness, and (b) produced higher emotional-intensity ratings, regardless of lighting condition, for movies but to a lesser extent for stills, indicating that intensity judgments of body gestures rely more on movement (or form-from-movement) than on static form information.

2.
Research suggests that infants progress from discrimination to recognition of emotions in faces during the first half year of life. It is unknown whether the perception of emotions from bodies develops in a similar manner. In the current study, when presented with happy and angry body videos and voices, 5-month-olds looked longer at the matching video when they were presented upright but not when they were inverted. In contrast, 3.5-month-olds failed to match even with upright videos. Thus, 5-month-olds but not 3.5-month-olds exhibited evidence of recognition of emotions from bodies by demonstrating intermodal matching. In a subsequent experiment, younger infants did discriminate between body emotion videos but failed to exhibit an inversion effect, suggesting that discrimination may be based on low-level stimulus features. These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months. This pattern of development is similar to face emotion knowledge development and suggests that both the face and body emotion perception systems develop rapidly during the first half year of life.

3.
Clarke TJ, Bradshaw MF, Field DT, Hampson SE, Rose D. Perception, 2005, 34(10): 1171-1180
We examined whether it is possible to identify the emotional content of behaviour from point-light displays where pairs of actors are engaged in interpersonal communication. These actors displayed a series of emotions, which included sadness, anger, joy, disgust, fear, and romantic love. In experiment 1, subjects viewed brief clips of these point-light displays presented the right way up and upside down. In experiment 2, the importance of the interaction between the two figures in the recognition of emotion was examined. Subjects were shown upright versions of (i) the original pairs (dyads), (ii) a single actor (monad), and (iii) a dyad comprising a single actor and his/her mirror image (reflected dyad). In each experiment, the subjects rated the emotional content of the displays by moving a slider along a horizontal scale. All of the emotions received a rating for every clip. In experiment 1, when the displays were upright, the correct emotions were identified in each case except disgust; but, when the displays were inverted, performance was significantly diminished for some emotions. In experiment 2, the recognition of love and joy was impaired by the absence of the acting partner, and the recognition of sadness, joy, and fear was impaired in the non-veridical (mirror image) displays. These findings both support and extend previous research by showing that biological motion is sufficient for the perception of emotion, although inversion affects performance. Moreover, emotion perception from biological motion can be affected by the veridical or non-veridical social context within the displays.

4.
In daily experience, children have access to a variety of cues to others’ emotions, including face, voice, and body posture. Determining which cues they use at which ages will help to reveal how the ability to recognize emotions develops. For happiness, sadness, anger, and fear, preschoolers (3-5 years, N = 144) were asked to label the emotion conveyed by dynamic cues in four cue conditions. The Face-only, Body Posture-only, and Multi-cue (face, body, and voice) conditions all were well recognized (M > 70%). In the Voice-only condition, recognition of sadness was high (72%), but recognition of the three other emotions was significantly lower (34%).

5.
Correctly perceiving emotions in others is a crucial part of social interactions. We constructed a set of dynamic stimuli to determine the relative contributions of the face and body to the accurate perception of basic emotions. We also manipulated the length of these dynamic stimuli in order to explore how much information is needed to identify emotions. The findings suggest that even a short exposure time of 250 milliseconds provided enough information to correctly identify an emotion above the chance level. Furthermore, we found that recognition patterns from the face alone and the body alone differed as a function of emotion. These findings highlight the role of the body in emotion perception and suggest an advantage for angry bodies: unlike for all other emotions, recognition rates for angry bodies were comparable to those for the face alone, which may be advantageous for perceiving imminent threat from a distance.

6.
Shame and guilt are closely related self-conscious emotions of negative affect that give rise to divergent self-regulatory and motivational behaviours. While guilt-proneness has demonstrated positive relationships with self-report measures of empathy and adaptive interpersonal functioning, shame-proneness tends to be unrelated or inversely related to empathy and is associated with interpersonal difficulties. At present, no research has examined relationships between shame- and guilt-proneness and facial emotion recognition ability. Participants (N = 363) completed measures of shame- and guilt-proneness along with a facial emotion recognition task which assessed the ability to identify displays of anger, sadness, happiness, fear, disgust, and shame. Guilt-proneness was consistently positively associated with facial emotion recognition ability. In contrast, shame-proneness was unrelated to capacity for facial emotion recognition. Findings provide support for theory arguing that guilt and empathy operate synergistically and may also help explain the inverse relationship between guilt-proneness and propensity for aggressive behaviour.

7.
Recognition of facial affect in Borderline Personality Disorder
Patients with Borderline Personality Disorder (BPD) have been described as emotionally hyperresponsive, especially to anger and fear in social contexts. The aim was to investigate whether BPD patients are more sensitive but less accurate in terms of basic emotion recognition, and show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions. Twenty-five women with BPD were compared with healthy controls on two different facial emotion recognition tasks. The first task allowed the assessment of the subjective detection threshold as well as the number of evaluation errors on six basic emotions. The second task assessed a response bias to blends of basic emotions. BPD patients showed no general deficit on the affect recognition task, but did show enhanced learning over the course of the experiment. For ambiguous emotional stimuli, we found a bias towards the perception of anger in the BPD patients but not towards fear. BPD patients are accurate in perceiving facial emotions, and are probably more sensitive to familiar facial expressions. They show a bias towards perceiving anger, when socio-affective cues are ambiguous. Interpersonal training should focus on the differentiation of ambiguous emotion in order to reduce a biased appraisal of others.

8.
In two studies, the robustness of anger recognition from bodily expressions is tested. In the first study, video recordings of an actor expressing four distinct emotions (anger, despair, fear, and joy) were systematically manipulated in terms of image impairment and body segmentation. The results show that anger recognition is more robust than recognition of other emotions to image impairment and to body segmentation. Moreover, the study showed that arms expressing anger were more robustly recognised than arms expressing other emotions. Study 2 added face blurring as a variable to the bodily expressions and showed that it decreased accurate emotion recognition, more so for recognition of joy and despair than for anger and fear. In sum, the paper indicates the robustness of anger recognition across multiple levels of degradation of bodily expressions.

9.
The body and the face are sensitive cues for recognizing others' emotions. Similar to the early visual processing of facial expressions, the P1 component is more sensitive to negative body expressions such as fear and anger, reflecting rapid and unconscious processing of threat-related body information. Emotional bodies and faces also undergo similar configural processing: both elicit comparable N170 components over visual cortex in temporo-occipital regions, although the underlying neural substrates are not entirely the same. During configural encoding, the N170 and the vertex positive potential (VPP) are more pronounced for facial expressions than for body expressions. At later processing stages of facial and bodily expressions, the early posterior negativity (EPN) reflects attention-directed processing of the visual encoding of faces and bodies, while the subsequent P3 and late positive component (LPC) represent higher-level cognitive processing of complex emotional information in parietal-frontal cortex. Body expressions additionally elicit an N190 component associated with the extrastriate body area, which is sensitive to both the emotional and the action information of the body. Future research should further examine the influence of action on emotion perception and the processing mechanisms of dynamic face-body emotion displays.

10.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions.

11.
In order to investigate the role of facial movement in the recognition of emotions, faces were covered with black makeup and white spots. Video recordings of such faces were played back so that only the white spots were visible. The results demonstrated that moving displays of happiness, sadness, fear, surprise, anger and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicated that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of these six emotions was also investigated using normally illuminated and spots-only displays. In both instances the results indicated that different facial regions are more informative for different emotions. The movement patterns characterizing the various emotional expressions as well as common confusions between emotions are also discussed.

12.
The current review introduces a new program of research that suggests the perception of spatial layout is influenced by emotions. Though perceptual systems are often described as closed and insulated, this review presents research suggesting that a variety of induced emotions (e.g., fear, disgust, sadness) can produce changes in vision and audition. Thus, the perceptual system may be highly interconnected, allowing emotional information to influence perceptions that, in turn, influence cognition. The body of work presented here also suggests that emotion-based changes in perception help us solve particular adaptive problems because emotion does not change all perceptions of the world. Taking the adaptive significance of emotion into account allows us to make predictions about when and how emotion influences perception.

13.
This study examined how 42 college students (21 Chinese, 21 Polish) judged the emotion category and intensity of semantically neutral sentences spoken by male and female speakers in five different emotional tones of voice (happiness, anger, fear, sadness, and neutral), in order to analyze differences in voice-based emotion perception between individuals from Chinese and Polish cultural backgrounds. The results showed that: (1) Chinese participants outperformed Polish participants in both the accuracy of emotion-category judgments and ratings of emotional intensity, indicating an in-group advantage in vocal emotion perception; (2) all participants identified emotion categories more accurately, and rated emotional intensity higher, for female voices than for male voices; (3) in category judgments, fear was recognized more accurately than happiness, sadness, and neutral tones, with neutral recognized least accurately; (4) in intensity ratings, fear was rated as more intense than sadness, and happiness was rated as least intense.

14.
This study investigated the hypothesis that different emotions are most effectively conveyed through specific, nonverbal channels of communication: body, face, and touch. Experiment 1 assessed the production of emotion displays. Participants generated nonverbal displays of 11 emotions, with and without channel restrictions. For both actual production and stated preferences, participants favored the body for embarrassment, guilt, pride, and shame; the face for anger, disgust, fear, happiness, and sadness; and touch for love and sympathy. When restricted to a single channel, participants were most confident about their communication when production was limited to the emotion's preferred channel. Experiment 2 examined the reception or identification of emotion displays. Participants viewed videos of emotions communicated in unrestricted and restricted conditions and identified the communicated emotions. Emotion identification in restricted conditions was most accurate when participants viewed emotions displayed via the emotion's preferred channel. This study provides converging evidence that some emotions are communicated predominantly through different nonverbal channels. Further analysis of these channel-emotion correspondences suggests that the social function of an emotion predicts its primary channel: The body channel promotes social-status emotions, the face channel supports survival emotions, and touch supports intimate emotions.

15.
Wang Yifang (王异芳), Su Yanjie (苏彦捷), He Quzhi (何曲枝). Acta Psychologica Sinica (《心理学报》), 2012, 44(11): 1472-1478
Starting from two cues in speech, prosody and semantics, this study explored the developmental characteristics of preschool children's emotion perception based on vocal cues. In Experiment 1, 124 children aged 3-5 years judged the emotion category of semantically neutral sentences spoken by male and female speakers in five different emotional tones (happiness, anger, fear, sadness, and neutral). Children's ability to perceive emotion from prosodic cues improved with age between 3 and 5 years, mainly for anger, fear, and neutral tones. The developmental trajectories were not identical across emotion categories: overall, happy prosody was the easiest to identify and fearful prosody the hardest. When prosodic and semantic cues conflicted, preschoolers relied more on prosodic cues to judge the speaker's emotional state. Participants were more sensitive to emotions expressed by female voices.

16.
The presence of information in a visual display does not guarantee its use by the visual system. Studies of inversion effects in both face recognition and biological-motion perception have shown that the same information may be used by observers when it is presented in an upright display but not used when the display is inverted. In our study, we tested the inversion effect in scrambled biological-motion displays to investigate mechanisms that validate information contained in the local motion of a point-light walker. Using novel biological-motion stimuli that contained no configural cues to the direction in which a walker was facing, we found that manipulating the relative vertical location of the walker's feet significantly affected observers' performance on a direction-discrimination task. Our data demonstrate that, by themselves, local cues can almost unambiguously indicate the facing direction of the agent in biological-motion stimuli. Additionally, we document a noteworthy interaction between local and global information and offer a new explanation for the effect of local inversion in biological-motion perception.

17.
Abstract: The multi-channel effect in facial emotion recognition refers to the combined influence that the various channels involved in stimulus processing exert on the recognition of facial expressions. Using behavioral experiments, event-related potentials, and neuroimaging, researchers have investigated the multiple channels through which information is acquired during this process: body expressions, emotional voices, and specific odors can all systematically influence the emotion recognized in a facial expression. A series of studies has explored the timing of these multi-channel effects, their underlying mechanisms, and the brain regions they activate. Future research could integrate brain-network techniques, together with new methods from other disciplines, to examine in finer detail the role played by the physical properties of the information carried by each channel. Keywords: facial expression; multi-channel effect; facial emotion recognition; body expression; emotional voice; olfactory signal

18.
Human interactions are replete with emotional exchanges. In these exchanges information about the emotional state of the interaction partners is only one type of information conveyed. In addition, emotion displays provide information about the interaction partners' disposition and the situation as such. That is, emotions serve as social signals. Acknowledging this role of emotions, this special section brings together research that illustrates how both person perception and situational understanding can be derived from emotional displays and the modulation of this process through context. Three contributions focus on information about expressers and their intentions. An additional article focuses on the informative value of emotional expressions for an observer's construal of social situations and another article exemplifies the way context determines the social impact of emotions. Finally, the last article presents the dynamic nature of mutual influence of emotions. In an attempt to integrate these contributions and offer lenses for future research, this editorial offers a contextualised model of social perception which attempts to systematise not only the types of information that emotion expressions can convey, but also to elaborate the notion of context.

19.
High levels of trait hostility are associated with wide-ranging interpersonal deficits and heightened physiological response to social stressors. These deficits may be attributable in part to individual differences in the perception of social cues. The present study evaluated the ability to recognize facial emotion among 48 high hostile (HH) and 48 low hostile (LH) smokers and whether experimentally-manipulated acute nicotine deprivation moderated relations between hostility and facial emotion recognition. A computer program presented series of pictures of faces that morphed from a neutral emotion into increasing intensities of happiness, sadness, fear, or anger, and participants were asked to identify the emotion displayed as quickly as possible. Results indicated that HH smokers, relative to LH smokers, required a significantly greater intensity of emotion expression to recognize happiness. No differences were found for other emotions across HH and LH individuals, nor did nicotine deprivation moderate relations between hostility and emotion recognition. This is the first study to show that HH individuals are slower to recognize happy facial expressions and that this occurs regardless of recent tobacco abstinence. Difficulty recognizing happiness in others may impact the degree to which HH individuals are able to identify social approach signals and to receive social reinforcement.

20.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for basic emotions including happiness, anger, fear, sadness, surprise, and disgust. 30 pictures (5 for each emotion) were displayed to 96 participants to assess recognition accuracy. The results showed that recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information congruent and not congruent with a facial expression was displayed before presenting pictures of facial expressions. The results of the second experiment showed that congruent information improved facial expression recognition, whereas incongruent information impaired such recognition.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) | 京ICP备09084417号