Similar Articles
1.
By the age of 4 years, children (N=120) know the meaning of the word disgust as well as they know the meaning of anger and fear; for example, when asked, they are equally able to generate a plausible cause for each of these emotions. Yet, in tasks involving facial expressions (free labelling of faces, deciding whether or not a face expresses disgust, or finding a “disgust face” in an array of faces), a majority of 3- to 7-year-old children (N=144) associated the prototypical “disgust face” with anger and denied its association with disgust (25% of adults on the same tasks did so as well). These results challenge the assumption that all humans easily recognise disgust from its facial expression and that this recognition is a precursor to children's understanding of the emotion of disgust.

2.
This study reanalyzes American and Japanese multiscalar ratings of universal facial expressions originally collected by Matsumoto (1986), of which only single emotion scales were analyzed and reported by Matsumoto and Ekman (1989). Because the full data set had never been analyzed, basic and important questions about the nature of judgments of universal facial expressions of emotion were left unaddressed; this study takes them up. We found that (1) observers in both cultures perceived multiple emotions in universal facial expressions, not just one; (2) cultural differences occurred on multiple emotion scales for each expression, not just the target scale; (3) the directions of those differences differed according to the rating scale used and the expression being observed; and (4) no underlying dimension was evidenced that would account for these differences. These findings raise new questions about the nature of the judgment process and the role of judgment studies in supporting the universality thesis, the bases of which need to be explored in future research and incorporated in future theories of emotion and universality.

3.
The common within-subjects design of studies on the recognition of emotion from facial expressions allows the judgement of one face to be influenced by previous faces, thus introducing the potential for artefacts. The present study (N=344) showed that the canonical “disgust face” was judged as disgusted, provided that the preceding set of faces included “anger expressions”, but was judged as angry when the preceding set of faces excluded anger but instead included persons who looked sad or about to be sick. Chinese observers showed lower recognition of the “disgust face” than did American observers. Chinese observers also showed lower recognition of the “fear face” when responding in Chinese than in English.

4.
North American (Canadian) and Indian observers were shown photographs of six facial emotions: happiness, sadness, fear, anger, surprise, and disgust, expressed by American Caucasian and Indian subjects. Observers were asked to judge each photograph, on a 7-point scale, for the degree of (a) distinctiveness (free from blending with other emotion categories), (b) pleasantness-unpleasantness, and (c) arousal-nonarousal of the expressed facial emotion. The results showed a significant Observer × Expressor × Emotion interaction for the distinctiveness judgement. Fearful and angry expressions in Indian faces, in comparison to Caucasian faces, were judged as less distinctly identifiable by observers of both cultural origins. Indian observers rated these two emotion expressions as more distinctive than did North Americans, irrespective of the culture of the expressor. In addition, Indian observers judged fearful and angry expressions as more unpleasant than did North Americans. Caucasians, in comparison to Indians, were judged to show more arousal in most of the emotion expressions.

5.
Shame, embarrassment, compassion, and contempt have been considered candidates for the status of basic emotions on the grounds that each has a recognisable facial expression. In two studies (N=88, N=60) on recognition of these four facial expressions, observers showed moderate agreement on the predicted emotion when assessed with forced choice (58%; 42%), but low agreement when assessed with free labelling (18%; 16%). Thus, even though some observers endorsed the predicted emotion when it was presented in a list, over 80% spontaneously interpreted these faces in a way other than the predicted emotion.

6.
Children acquire emotion categories gradually
Some accounts imply that basic-level emotion categories are acquired early and quickly, whereas others imply that they are acquired later and more gradually. Our study examined this question for fear, happiness, sadness, and anger in the context of children's categorization of emotional facial expressions. Children (N = 168, 2–5 years) first labeled facial expressions of six emotions and were then shown a box and asked to put all and only, e.g., scared people in it. Before using fear in labeling, children had begun to include ‘fear’ faces and to exclude other (especially positive) faces from the fear box/category; after using fear, children continued to include other (especially negative) faces. The same pattern was observed for happiness, sadness, and anger. Emotion categories begin broad, including all emotions/faces of the same valence, and then gradually narrow over the preschool years.

7.
This investigation examined whether impairment in configural processing could explain deficits in face emotion recognition in people with Parkinson’s disease (PD). Stimuli from the Radboud Faces Database were used to compare recognition of four negative emotion expressions by older adults with PD (n = 16) and matched controls (n = 17). Participants were tasked with categorizing emotional expressions from upright and inverted whole faces and facial composites; it is difficult to derive configural information from these two types of stimuli, so featural processing should play a larger than usual role in accurate recognition of emotional expressions. We found that the PD group were impaired relative to controls in recognizing anger, disgust and fearful expressions in upright faces. Then, consistent with a configural processing deficit, participants with PD showed no composite effect when attempting to identify facial expressions of anger, disgust and fear. A face inversion effect, however, was observed in the performance of all participants in both the whole faces and facial composites tasks. These findings can be explained in terms of a configural processing deficit if it is assumed that the disruption caused by facial composites was specific to configural processing, whereas inversion reduced performance by making it difficult to derive both featural and configural information from faces.

8.
Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23), depressed children and adolescents without conduct disorder (n = 29) and children and adolescents without disorder (n = 37). A novel face emotion processing experiment presented faces with ‘happy’, ‘sad’, ‘angry’, or ‘fearful’ expressions of varying emotional intensity using morphed stimuli. Those with depression showed no overall or specific deficits in facial expression recognition accuracy. Instead, they showed biases affecting processing of low-intensity expressions, more often perceiving these as sad. In contrast, non-depressed controls more often misperceived low intensity negative emotions as happy. There were no differences between depressed children and adolescents with and without conduct disorder, or between children with comorbid depression/conduct disorder and controls. Face emotion processing biases rather than deficits appear to distinguish depressed from non-depressed children and adolescents.

9.
Some evidence suggests that the cerebellum participates in the complex network processing emotional facial expression. To evaluate the role of the cerebellum in recognising facial expressions we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p=.0021; cathodal tDCS, p=.018), but left positive emotion and neutral facial expressions unchanged (p>.05). tDCS over the right prefrontal cortex left facial expressions of both negative and positive emotion unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.

10.
Few studies have examined potential differences between social anxiety disorder (SAD) and generalised anxiety disorder (GAD) in the sensitivity to detect emotional expressions. The present study aims to compare the detection of emotional expressions in SAD and GAD. Participants with a primary diagnosis of GAD (n = 46), SAD (n = 70), and controls (n = 118) completed a morph movies task. The task presented faces expressing increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. Participants used a slide bar to view the movie frames from left to right, and to stop at the first frame where they perceived an emotion; the frame selected thus indicated the intensity of emotion required to identify the facial expression. Participants with GAD detected the onset of facial emotions at lower intensity of emotion than participants with SAD (p = 0.002) and controls (p = 0.039). In a multiple regression analysis controlling for age, race, and depressive symptom severity, a lower frame at which the emotion was detected was independently associated with GAD diagnosis (B = −5.73, SE = 1.74, p …).

11.
Following Yik and Russell (1999), a judgement paradigm was used to examine to what extent differential accuracy of recognition of facial expressions allows evaluation of the well-foundedness of different theoretical views on emotional expression. Observers judged photos showing facial expressions of seven emotions on the basis of: (1) discrete emotion categories; (2) social message types; (3) appraisal results; or (4) action tendencies, and rated their confidence in making choices. Emotion categories and appraisals were judged significantly more accurately and confidently than messages or action tendencies. These results do not support claims of primacy for message or action tendency views of facial expression. Based on a componential model of emotion, it is suggested that judges can infer components from categories and vice versa.

12.
This study examined autistic children's existing ability to identify emotion type, and differences in that ability, when recognising angry and happy facial expressions of low (10%, 30%), medium (40%, 60%), and high (70%, 90%) intensity. Using an expression-labelling paradigm, 3D composite facial expression stimuli of varying intensity were presented on a computer via E-prime to 10 children with autism, 10 typically developing children, and 10 children with intellectual disability. Results showed that children with autism had a facial expression recognition deficit for low-intensity expressions, with recognition accuracy significantly lower than that of children with intellectual disability and typically developing children. Their recognition accuracy was positively correlated with expression intensity: the stronger the facial expression, the higher their accuracy. At low intensity, children with autism recognised happy expressions more accurately than angry ones, whereas at medium and high intensity a significant anger-superiority effect emerged.

13.
The present study investigated the potential protective role of components of emotion knowledge (i.e., emotion recognition, situation knowledge) in the links between young children's shyness and indices of socio‐emotional functioning. Participants were N = 163 children (82 boys and 81 girls) aged 23–77 months (M = 53.29, SD = 14.48), recruited from preschools in Italy. Parents provided ratings of child shyness and teachers rated children's socio‐emotional functioning at preschool (i.e., social competence, anxiety‐withdrawal, peer rejection). Children were also interviewed to assess their abilities to recognize facial emotional expressions and identify situations that affect emotions. Among the results, shyness was positively related to anxiety‐withdrawal and peer rejection. In addition, emotion recognition was found to significantly moderate the links between shyness and preschool socio‐emotional functioning, appearing to serve a buffering role. For example, at lower levels of emotion recognition, shyness was positively associated with both anxiety‐withdrawal and rejection by peers, but at higher levels of emotion recognition, these associations were attenuated. Results are discussed in terms of the protective role of emotion recognition in promoting shy children's positive socio‐emotional functioning within the classroom context.

14.
The six basic emotions (disgust, anger, fear, happiness, sadness, and surprise) have long been considered discrete categories that serve as the primary units of the emotion system. Yet recent evidence has indicated underlying connections among them. Here we tested the underlying relationships among the six basic emotions using a perceptual learning procedure. This technique has the potential of causally changing participants’ emotion detection ability. We found that training on detecting a facial expression improved performance not only on the trained expression but also on other expressions. Such a transfer effect was consistently demonstrated between disgust and anger detection as well as between fear and surprise detection in two experiments (Experiment 1A, n = 70; Experiment 1B, n = 42). Notably, training on any of the six emotions could improve happiness detection, while sadness detection could only be improved by training on sadness itself, suggesting the uniqueness of happiness and sadness. In an emotion recognition test using a large sample of Chinese participants (n = 1748), the confusion between disgust and anger as well as between fear and surprise was further confirmed. Taken together, our study demonstrates that the “basic” emotions share some common psychological components, which might be the more basic units of the emotion system.

15.
Contradictory evidence exists regarding the link between loneliness and sensitivity to facial cues of emotion, as loneliness has been related to better but also to worse performance on facial emotion recognition tasks. This study aims to contribute to this debate and extends previous work by (a) focusing on both accuracy and sensitivity in detecting positive and negative expressions, (b) controlling for depressive symptoms and social anxiety, and (c) using an advanced emotion recognition task with videos of neutral adolescent faces gradually morphing into full-intensity expressions. Participants were 170 adolescents (49% boys; Mage = 13.65 years) from rural, low-income schools. Results showed that loneliness was associated with increased sensitivity to happy, sad, and fearful faces. When controlling for depressive symptoms and social anxiety, loneliness remained significantly associated with sensitivity to sad and fearful faces. Together, these results suggest that lonely adolescents are vigilant to negative facial cues of emotion.

17.
Facial stimuli are widely used in behavioural and brain science research to investigate emotional facial processing. However, some studies have demonstrated that dynamic expressions elicit stronger emotional responses compared to static images. To address the need for more ecologically valid and powerful facial emotional stimuli, we created Dynamic FACES, a database of morphed videos (n = 1026) from younger, middle-aged, and older adults displaying naturalistic emotional facial expressions (neutrality, sadness, disgust, fear, anger, happiness). To assess adult age differences in emotion identification of dynamic stimuli and to provide normative ratings for this modified set of stimuli, healthy adults (n = 1822, age range 18–86 years) categorised for each video the emotional expression displayed, rated the expression distinctiveness, estimated the age of the face model, and rated the naturalness of the expression. We found few age differences in emotion identification when using dynamic stimuli. Only for angry faces did older adults show lower levels of identification accuracy than younger adults. Further, older adults outperformed middle-aged adults in identification of sadness. The use of dynamic facial emotional stimuli has previously been limited, but Dynamic FACES provides a large database of high-resolution naturalistic, dynamic expressions across adulthood. Information on using Dynamic FACES for research purposes can be found at http://faces.mpib-berlin.mpg.de.

18.
The current longitudinal study (N = 107) examined mothers’ facial emotion recognition using reaction time and their infants’ affect-based attention at 5, 7, and 14 months of age using eyetracking. Our results, examining maternal and infant responses to angry, fearful and happy facial expressions, show that only maternal responses to angry facial expressions were robustly and positively linked across time points, indexing a consistent trait-like response to social threat among mothers. However, neither maternal responses to happy or fearful facial expressions nor infant responses to any of the three facial emotions showed such consistency, pointing to the changeable nature of facial emotion processing, especially among infants. In general, infants’ attention toward negative emotions (i.e., anger and fear) at earlier timepoints was linked to their affect-biased attention for these emotions at 14 months but showed greater dynamic change across time. Moreover, our results provide limited evidence for developmental continuity in processing negative emotions and for the bidirectional interplay of infant affect-biased attention and maternal facial emotion recognition. This pattern of findings suggests that infants’ affect-biased attention to facial expressions of emotion is characterized by dynamic changes.

19.
There is substantial evidence for facial emotion recognition (FER) deficits in autism spectrum disorder (ASD). The extent of this impairment, however, remains unclear, and there is some suggestion that clinical groups might benefit from the use of dynamic rather than static images. High-functioning individuals with ASD (n = 36) and typically developing controls (n = 36) completed a computerised FER task involving static and dynamic expressions of the six basic emotions. The ASD group showed poorer overall performance in identifying anger and disgust and were disadvantaged by dynamic (relative to static) stimuli when presented with sad expressions. Among both groups, however, dynamic stimuli appeared to improve recognition of anger. This research provides further evidence of specific impairment in the recognition of negative emotions in ASD, but argues against any broad advantages associated with the use of dynamic displays.

20.
Adults perceive emotional expressions categorically, with discrimination being faster and more accurate between expressions from different emotion categories (i.e. blends with two different predominant emotions) than between two stimuli from the same category (i.e. blends with the same predominant emotion). The current study sought to test whether facial expressions of happiness and fear are perceived categorically by pre-verbal infants, using a new stimulus set that was shown to yield categorical perception in adult observers (Experiments 1 and 2). These stimuli were then used with 7-month-old infants (N = 34) using a habituation and visual preference paradigm (Experiment 3). Infants were first habituated to an expression of one emotion, then presented with the same expression paired with a novel expression either from the same emotion category or from a different emotion category. After habituation to fear, infants displayed a novelty preference for pairs of between-category expressions, but not within-category ones, showing categorical perception. However, infants showed no novelty preference when they were habituated to happiness. Our findings provide evidence for categorical perception of emotional expressions in pre-verbal infants, while the asymmetrical effect challenges the notion of a bias towards negative information in this age group.

