Similar Articles
Found 20 similar articles (search time: 140 ms)
1.
2.
Research on emotion recognition has been dominated by studies of photographs of facial expressions. A full understanding of emotion perception and its neural substrate will require investigations that employ dynamic displays and means of expression other than the face. Our aims were: (i) to develop a set of dynamic and static whole-body expressions of basic emotions for systematic investigations of clinical populations, and for use in functional-imaging studies; (ii) to assess forced-choice emotion-classification performance with these stimuli relative to the results of previous studies; and (iii) to test the hypotheses that more exaggerated whole-body movements would produce (a) more accurate emotion classification and (b) higher ratings of emotional intensity. Ten actors portrayed 5 emotions (anger, disgust, fear, happiness, and sadness) at 3 levels of exaggeration, with their faces covered. Two identical sets of 150 emotion portrayals (full-light and point-light) were created from the same digital footage, along with corresponding static images of the 'peak' of each emotion portrayal. Recognition tasks confirmed previous findings that basic emotions are readily identifiable from body movements, even when static form information is minimised by use of point-light displays, and showed that static 'peak' images of both full-light and point-light stimuli can also convey identifiable emotions, though rather less efficiently than dynamic displays. Recognition success differed for individual emotions, corroborating earlier results about the importance of distinguishing differences in movement characteristics for different emotional expressions. The patterns of misclassifications were in keeping with earlier findings on emotional clustering. Exaggeration of body movement (a) enhanced recognition accuracy, especially for the dynamic point-light displays, but notably not for sadness, and (b) produced higher emotional-intensity ratings, regardless of lighting condition, for movies but to a lesser extent for stills, indicating that intensity judgements of body gestures rely more on movement (or form-from-movement) than on static form information.

3.
Several studies have reported impairment in the recognition of facial expressions of disgust in patients with Huntington's disease (HD) and preclinical carriers of the HD gene. The aim of this study was to establish whether impairment for disgust in HD patients extended to include the ability to express the emotion on their own faces. Eleven patients with HD, and 11 age and education matched healthy controls participated in three tasks concerned with the expression of emotions. One task assessed the spontaneous production of disgust-like facial expressions during the smelling of offensive odorants. A second assessed the production of posed facial expressions during deliberate attempts to communicate emotion. The third task evaluated HD patients' ability to imitate the specific facial configurations associated with each emotion. Foul odours induced fewer disgust-like facial reactions in HD patients than in controls, and patients' posed facial expressions of disgust were less accurate than the posed disgust expressions of controls. The effect was selective to disgust; patients had no difficulty posing expressions of other emotions. These impairments were not explained by compromised muscle control: HD patients had no difficulty imitating the facial movements required to display disgust. Viewed together with evidence of difficulty in other aspects of disgust in HD, the findings suggest that a common substrate might participate in both the processing and the expression of this emotion.

4.
Subjects were presented with videotaped expressions of 10 classic Hindu emotions. The 10 emotions were (in rough translation from Sanskrit) anger, disgust, fear, heroism, humor-amusement, love, peace, sadness, shame-embarrassment, and wonder. These emotions (except for shame) and their portrayal were described about 2,000 years ago in the Natyasastra, and are enacted in the contemporary Hindu classical dance. The expressions are dynamic and include both the face and the body, especially the hands. Three different expressive versions of each emotion were presented, along with 15 neutral expressions. American and Indian college students responded to each of these 45 expressions using either a fixed-response format (10 emotion names and "neutral/no emotion") or a totally free response format. Participants from both countries were quite accurate in identifying emotions correctly using both fixed-choice (65% correct, expected value of 9%) and free-response (61% correct, expected value close to zero) methods.

5.
By the age of 4 years, children (N=120) know the meaning of the word disgust as well as they know the meaning of anger and fear; for example, when asked, they are equally able to generate a plausible cause for each of these emotions. Yet, in tasks involving facial expressions (free labelling of faces, deciding whether or not a face expresses disgust, or finding a "disgust face" in an array of faces), a majority of 3- to 7-year-old children (N=144) associated the prototypical "disgust face" with anger and denied its association with disgust (25% of adults on the same tasks did so as well). These results challenge the assumption that all humans easily recognise disgust from its facial expression and that this recognition is a precursor to children's understanding of the emotion of disgust.

6.
Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.

7.
The common within-subjects design of studies on the recognition of emotion from facial expressions allows the judgement of one face to be influenced by previous faces, thus introducing the potential for artefacts. The present study (N=344) showed that the canonical "disgust face" was judged as disgusted, provided that the preceding set of faces included "anger expressions", but was judged as angry when the preceding set of faces excluded anger but instead included persons who looked sad or about to be sick. Chinese observers showed lower recognition of the "disgust face" than did American observers. Chinese observers also showed lower recognition of the "fear face" when responding in Chinese than in English.

8.
The effects of Asian and Caucasian facial morphology were examined by having Canadian children categorize pictures of facial expressions of basic emotions. The pictures were selected from the Japanese and Caucasian Facial Expressions of Emotion set developed by D. Matsumoto and P. Ekman (1989). Sixty children between the ages of 5 and 10 years were presented with short stories and an array of facial expressions, and were asked to point to the expression that best depicted the specific emotion experienced by the characters. The results indicated that expressions of fear and surprise were better categorized from Asian faces, whereas expressions of disgust were better categorized from Caucasian faces. These differences originated in some specific confusions between expressions.

9.
The aim of the present study was to contribute to the literature on the ability to recognize anger, happiness, fear, surprise, sadness, disgust, and neutral expressions from facial information (whole face, eye region, mouth region). More specifically, the aim was to investigate older adults' performance in emotion recognition using the same tool employed in previous studies of children's and adults' performance, and to verify whether their pattern of emotion recognition differs from that of the other two groups. Results showed that happiness was among the easiest emotions to recognize, while disgust was consistently among the most difficult for older adults. Emotions were recognized more easily when pictures represented the whole face; of the two specific regions (eye and mouth), older participants recognized emotions more easily when the mouth region was presented. In general, the study did not detect a decay in the ability to recognize emotions from the face, eyes, or mouth. The performance of the older adults was statistically worse than that of the other two groups in only a few cases: anger and disgust recognition from the whole face; anger recognition from the eye region; and disgust, fear, and neutral expression recognition from the mouth region.

11.
In order to investigate the role of facial movement in the recognition of emotions, faces were covered with black makeup and white spots. Video recordings of such faces were played back so that only the white spots were visible. The results demonstrated that moving displays of happiness, sadness, fear, surprise, anger and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicated that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of these six emotions was also investigated using normally illuminated and spots-only displays. In both instances the results indicated that different facial regions are more informative for different emotions. The movement patterns characterizing the various emotional expressions as well as common confusions between emotions are also discussed.

12.
Some theories of emotion emphasise a close relationship between interoception and subjective experiences of emotion. In this study, we used facial expressions to examine whether interoceptive sensibility modulated emotional experience in a social context. Interoceptive sensibility was measured using the heartbeat detection task. To estimate individual emotional sensitivity, we made morphed photos that ranged between a neutral and an emotional facial expression (i.e., anger, sadness, disgust and happiness). Recognition rates of particular emotions from these photos were calculated and considered as emotional sensitivity thresholds. Our results indicate that participants with accurate interoceptive awareness are sensitive to the emotions of others, especially for expressions of sadness and happiness. We also found that false responses to sad faces were closely related to an individual's degree of social anxiety. These results suggest that interoceptive awareness modulates the intensity of the subjective experience of emotion and affects individual traits related to emotion processing.

13.
ABSTRACT

Objective: The ability to perceive facial emotion varies with age. Relative to younger adults (YA), older adults (OA) are less accurate at identifying fear, anger, and sadness, and more accurate at identifying disgust. Because different emotions are conveyed by different parts of the face, changes in visual scanning patterns may account for age-related variability. We investigated the relation between scanning patterns and recognition of facial emotions. Additionally, as frontal-lobe changes with age may affect scanning patterns and emotion recognition, we examined correlations between scanning parameters and performance on executive function tests. Methods: We recorded eye movements from 16 OA (mean age 68.9) and 16 YA (mean age 19.2) while they categorized facial expressions and non-face control images (landscapes), and administered standard tests of executive function. Results: OA were less accurate than YA at identifying fear (p < .05, r = .44) and more accurate at identifying disgust (p < .05, r = .39). OA fixated less than YA on the top half of the face for disgust, fearful, happy, neutral, and sad faces (p values < .05, r values ≥ .38), whereas there was no group difference for landscapes. For OA, executive function was correlated with recognition of sad expressions and with scanning patterns for fearful, sad, and surprised expressions. Conclusion: We report significant age-related differences in visual scanning that are specific to faces. The observed relation between scanning patterns and executive function supports the hypothesis that frontal-lobe changes with age may underlie some changes in emotion recognition.

14.
The authors investigated children's ability to recognize emotions from the information available in the lower, middle, or upper face. School-age children were shown partial or complete facial expressions and asked to say whether they corresponded to a given emotion (anger, fear, surprise, or disgust). The results indicate that 5-year-olds were able to recognize fear, anger, and surprise from partial facial expressions. Fear was better recognized from the information located in the upper face than from that located in the lower face. A similar pattern of results was found for anger, but only in girls. Recognition improved between 5 and 10 years old for surprise and anger, but not for fear and disgust.

15.
16.
The present study investigated whether facial expressions of emotion are recognized holistically (i.e., all at once, as an entire unit), as faces are, or featurally, as other nonface stimuli are. Evidence for holistic processing of faces comes from a reliable decrement in recognition performance when faces are presented inverted rather than upright. If emotion is recognized holistically, then recognition of facial expressions of emotion should be impaired by inversion. To test this, participants were shown schematic drawings of faces showing one of six emotions (surprise, sadness, anger, happiness, disgust, and fear) in either an upright or inverted orientation and were asked to indicate the emotion depicted. Participants were more accurate in the upright than in the inverted orientation, providing evidence in support of holistic recognition of facial emotion. Because recognition of facial expressions of emotion is important in social relationships, this research may have implications for treatment of some social disorders.

17.
Previous choice reaction time studies have provided consistent evidence for faster recognition of positive (e.g., happy) than negative (e.g., disgusted) facial expressions. A predominance of positive emotions in normal contexts may partly explain this effect. The present study used pleasant and unpleasant odors to test whether emotional context affects the happy face advantage. Results from 2 experiments indicated that happiness was recognized faster than disgust in a pleasant context, but this advantage disappeared in an unpleasant context because of the slow recognition of happy faces. Odors may modulate the functioning of those emotion-related brain structures that participate in the formation of the perceptual representations of the facial expressions and in the generation of the conceptual knowledge associated with the signaled emotion.

18.
Research on disgust in neuroscience, medicine, and psychology often relies on a disgust facial expression from a standardized set. Two studies (N = 60 and N = 160) compared this standard disgust face to a new facial expression called the "sick face" posed by three different actors asked to look as if they were sick and about to vomit. Relative to the standard disgust face, the sick face was significantly more likely to be endorsed as disgust, less likely to be endorsed as another emotion, and rated as conveying disgust more intensely. Disgust may not have a facial signal, but various faces may serve as cues to disgust.

19.
Faces are widely used as stimuli in various research fields. Interest in emotion-related differences and age-associated changes in the processing of faces is growing. With the aim of systematically varying both expression and age of the face, we created FACES, a database comprising N=171 naturalistic faces of young, middle-aged, and older women and men. Each face is represented with two sets of six facial expressions (neutrality, sadness, disgust, fear, anger, and happiness), resulting in 2,052 individual images. A total of N=154 young, middle-aged, and older women and men rated the faces in terms of facial expression and perceived age. With its large age range of faces displaying different expressions, FACES is well suited for investigating developmental and other research questions on emotion, motivation, and cognition, as well as their interactions. Information on using FACES for research purposes can be found at http://faces.mpib-berlin.mpg.de.

20.
While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues and whether infants' emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigate these questions using Baby-FACS to code infants' facial displays and eye-movement tracking to examine infants' looking times at facial expressions. Three-, 7-, and 12-month-old participants were exposed to dynamic facial expressions (joy, anger, fear, disgust, sadness) of a virtual model which either looked at the infant or had an averted gaze. Infants did not match emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model's negative expressions and they looked more at areas of the face recruiting facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period where the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号