Similar Documents
20 similar documents found (search time: 15 ms)
1.
It is well-known that patients having sustained frontal-lobe traumatic brain injury (TBI) are severely impaired on tests of emotion recognition. Indeed, these patients have significant difficulty recognizing facial expressions of emotion, and such deficits are often associated with decreased social functioning and poor quality of life. To date, no studies have examined the response patterns which underlie facial emotion recognition impairment in TBI and which may lend clarity to the interpretation of deficits. Therefore, the present study aimed to characterize response patterns in facial emotion recognition in 14 patients with frontal TBI compared to 22 matched control subjects, using a task which required participants to rate the intensity of each emotion (happiness, sadness, anger, disgust, surprise and fear) of a series of photographs of emotional and neutral faces. Results first confirmed the presence of facial emotion recognition impairment in TBI, and further revealed that patients displayed a liberal bias when rating facial expressions, leading them to assign intense ratings on incorrect emotional labels to sad, disgusted, surprised and fearful facial expressions. These findings are generally in line with prior studies which also report important facial affect recognition deficits in TBI patients, particularly for negative emotions.

2.
There is substantial evidence to suggest that deafness is associated with delays in emotion understanding, which has been attributed to delays in language acquisition and fewer opportunities to converse. However, studies addressing the ability to recognise facial expressions of emotion have produced equivocal findings. The two experiments presented here attempt to clarify emotion recognition in deaf children by considering two aspects: the role of motion and the role of intensity in deaf children’s emotion recognition. In Study 1, 26 deaf children were compared to 26 age-matched hearing controls on a computerised facial emotion recognition task involving static and dynamic expressions of six emotions. Eighteen of the deaf children and 18 age-matched hearing controls additionally took part in Study 2, involving the presentation of the same six emotions at varying intensities. Study 1 showed that deaf children’s emotion recognition was better in the dynamic than in the static condition, whereas the hearing children showed no difference in performance between the two conditions. In Study 2, the deaf children performed no differently from the hearing controls, showing improved recognition rates with increasing intensity. With the exception of disgust, no differences in individual emotions were found. These findings highlight the importance of using ecologically valid stimuli to assess emotion recognition.

3.
The relationship between knowledge of American Sign Language (ASL) and the ability to encode facial expressions of emotion was explored. Participants were 55 college students, half of whom were intermediate-level students of ASL and half of whom had no experience with a signed language. In front of a video camera, participants posed the affective facial expressions of happiness, sadness, fear, surprise, anger, and disgust. These facial expressions were randomized onto stimulus tapes that were then shown to 60 untrained judges who tried to identify the expressed emotions. Results indicated that hearing subjects knowledgeable in ASL were generally more adept than were hearing nonsigners at conveying emotions through facial expression. Results have implications for better understanding the nature of nonverbal communication in hearing and deaf individuals.

4.
The aim of the present study was to examine social cognition and social functioning in a group of amnestic mild cognitive impairment (aMCI) and Alzheimer’s dementia (AD) patients. Thirty-one people with aMCI, 29 individuals with AD, and 45 healthy older adults participated in the study. Facial expressions of happiness, anger, fear, disgust, and surprise presented at different intensities had to be labelled. Mentalizing was assessed using first-order belief theory of mind (ToM) stories, and everyday social functioning by the Inventory of Interpersonal Situations (IIS), completed by an informant. aMCI patients were impaired in recognizing the emotions anger, disgust, and fear, while AD patients were impaired in recognizing the emotions anger, disgust, and surprise. More importantly, no significant differences between aMCI and AD patients were found on overall emotion recognition. Both the aMCI and AD patients were impaired on the ToM task, but no differences between the aMCI and AD patients were found. On everyday social functioning, only the AD patients showed impairments. No associations between the IIS and ToM were found, but the IIS and emotion perception were significantly correlated. Regression analysis taking all potentially confounding variables into account showed that only mood, but not the social-cognitive task performance or any other cognitive variable, predicted social functioning. aMCI and AD patients demonstrated impairments in mentalizing and facial emotion perception, and showed decrements in everyday social functioning. Informing caregivers about these deficits may help them to understand deficits in social cognition that may already be present in the MCI stage of Alzheimer’s disease.

7.
Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). In addition, FTLD patients show impairments in emotion processing; specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which thus may have been a confounding factor in previous studies. Ceiling effects are also often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at the recognition of the emotion anger. The patients also performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

8.
A body of work has developed over the last 20 years that explores facial emotion perception in Borderline Personality Disorder (BPD). Through a database literature search, we identified 25 behavioural and functional imaging studies that tested facial emotion processing differences between patients with BPD and healthy controls. Despite methodological differences, there is consistent evidence supporting a negative response bias to neutral and ambiguous facial expressions in patients. Findings for negative emotions are mixed, with evidence from individual studies of an enhanced sensitivity to fearful expressions and impaired facial emotion recognition of disgust, while meta-analysis revealed no significant differences in recognition between BPD patients and healthy controls for any negative emotion. Mentalizing studies indicate that BPD patients are accurate at attributing mental states to complex social stimuli. Functional neuroimaging data suggest that the underlying neural substrate involves hyperactivation in the amygdala to affective facial stimuli, and altered activation in the anterior cingulate, inferior frontal gyrus and the superior temporal sulcus, particularly during social emotion processing tasks. Future studies must address methodological inconsistencies, particularly variations in patients’ key clinical characteristics and in the testing paradigms deployed.

9.
Recognition of facial affect in Borderline Personality Disorder (total citations: 1; self-citations: 0; citations by others: 1)
Patients with Borderline Personality Disorder (BPD) have been described as emotionally hyperresponsive, especially to anger and fear in social contexts. The aim was to investigate whether BPD patients are more sensitive but less accurate in terms of basic emotion recognition, and show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions. Twenty-five women with BPD were compared with healthy controls on two different facial emotion recognition tasks. The first task allowed the assessment of the subjective detection threshold as well as the number of evaluation errors on six basic emotions. The second task assessed a response bias to blends of basic emotions. BPD patients showed no general deficit on the affect recognition task, but did show enhanced learning over the course of the experiment. For ambiguous emotional stimuli, we found a bias towards the perception of anger in the BPD patients but not towards fear. BPD patients are accurate in perceiving facial emotions, and are probably more sensitive to familiar facial expressions. They show a bias towards perceiving anger when socio-affective cues are ambiguous. Interpersonal training should focus on the differentiation of ambiguous emotion in order to reduce a biased appraisal of others.

10.
Hosie, J. A., Gray, C. D., Russell, P. A., Scott, C., & Hunter, N. (1998). Motivation and Emotion, 22(4), 293-313.
This paper reports the results of three tasks comparing the development of the understanding of facial expressions of emotion in deaf and hearing children. Two groups of hearing and deaf children of elementary school age were tested for their ability to match photographs of facial expressions of emotion, and to produce and comprehend emotion labels for the expressions of happiness, sadness, anger, fear, disgust, and surprise. Accuracy data showed comparable levels of performance for deaf and hearing children of the same age. Happiness and sadness were the most accurately matched expressions and the most accurately produced and comprehended labels. Anger was the least accurately matched expression and the most poorly comprehended emotion label. Disgust was the least accurately labeled expression; however, deaf children were more accurate at labeling this expression, and also at labeling fear, than hearing children. Error data revealed that children confused anger with disgust, and fear with surprise. However, the younger groups of deaf and hearing children also showed a tendency to confuse the negative expressions of anger, disgust, and fear with sadness. The results suggest that, despite possible differences in the early socialisation of emotion, deaf and hearing children share a common understanding of the emotions conveyed by distinctive facial expressions.

11.
There is substantial evidence for facial emotion recognition (FER) deficits in autism spectrum disorder (ASD). The extent of this impairment, however, remains unclear, and there is some suggestion that clinical groups might benefit from the use of dynamic rather than static images. High-functioning individuals with ASD (n = 36) and typically developing controls (n = 36) completed a computerised FER task involving static and dynamic expressions of the six basic emotions. The ASD group showed poorer overall performance in identifying anger and disgust and were disadvantaged by dynamic (relative to static) stimuli when presented with sad expressions. Among both groups, however, dynamic stimuli appeared to improve recognition of anger. This research provides further evidence of specific impairment in the recognition of negative emotions in ASD, but argues against any broad advantages associated with the use of dynamic displays.

12.
The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants’ perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly in anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing interference on recognizing the parts of the expressions created by holistic perception of the aligned composite images. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.

13.
Antisocial individuals have problems recognizing negative emotions (e.g. Marsh & Blair in Neuroscience and Biobehavioral Reviews 32:454–465, 2009); however, due to issues with sampling and differences in the methods used, previous findings have varied. Sixty-three male young offenders and 37 age-, IQ- and socio-economic status-matched male controls completed a facial emotion recognition task, which measures recognition of happiness, sadness, fear, anger, disgust, surprise and neutral expressions across four emotional intensities. Conduct disorder (YSR) and psychopathic and callous/unemotional traits (YPI) were measured, and offenders’ offense data were taken from the Youth Offending Service’s case files. Relative to controls, offenders were significantly worse at identifying sadness, low-intensity disgust and high-intensity fear. A significant interaction for anger was also observed, with offenders showing reduced low- but increased high-intensity anger recognition in comparison with controls. Within the young offenders, levels of conduct disorder and psychopathic traits explained variation in sadness and disgust recognition, whereas offense severity explained variation in anger recognition. These results suggest that antisocial youths show specific problems in recognizing negative emotions and support the use of targeted emotion recognition interventions for problematic behavior.

14.
Most previous research reporting emotion-recognition deficits in schizophrenia has used posed facial expressions of emotion and chronic-schizophrenia patients. In contrast, the present research examined the ability of patients with acute paranoid and nonparanoid (disorganized) schizophrenia to recognize genuine as well as posed facial expressions of emotion. Evidence of an emotion-recognition deficit in schizophrenia was replicated, but only when posed facial expressions were used. For genuine expressions of emotion, the paranoid-schizophrenia group was more accurate than controls, nonparanoid-schizophrenia patients, and depressed patients. Future research clearly needs to consider the posed versus genuine nature of the emotional stimuli used and the type of schizophrenia patients examined.

15.
It has been proposed that self-face representations are involved in interpreting facial emotions of others. We experimentally primed participants' self-face representations. In Study 1, we assessed eye tracking patterns and performance on a facial emotion discrimination task, and in Study 2, we assessed emotion ratings between self and nonself groups. Results show that experimental priming of self-face representations increases visual exploration of faces, facilitates the speed of facial expression processing, and increases the emotional distance between expressions. These findings suggest that the ability to interpret facial expressions of others is intimately associated with the representations we have of our own faces.

16.
Does our perception of others' emotional signals depend on the language we speak, or is our perception the same regardless of language and culture? It is well established that human emotional facial expressions are perceived categorically by viewers, but whether this is driven by perceptual or linguistic mechanisms is debated. We report an investigation into the perception of emotional facial expressions, comparing German speakers to native speakers of Yucatec Maya, a language with no lexical labels that distinguish disgust from anger. In a free naming task, speakers of German, but not Yucatec Maya, made lexical distinctions between disgust and anger. However, in a delayed match-to-sample task, both groups perceived emotional facial expressions of these and other emotions categorically. The magnitude of this effect was equivalent across the language groups, as well as across emotion continua with and without lexical distinctions. Our results show that the perception of affective signals is not driven by lexical labels, instead lending support to accounts of emotions as a set of biologically evolved mechanisms.

17.
The ability to recognize and label emotional facial expressions is an important aspect of social cognition. However, existing paradigms to examine this ability present only static facial expressions, suffer from ceiling effects or have limited or no norms. A computerized test, the Emotion Recognition Task (ERT), was developed to overcome these difficulties. In this study, we examined the effects of age, sex, and intellectual ability on emotion perception using the ERT. In this test, emotional facial expressions are presented as morphs gradually expressing one of the six basic emotions from neutral to four levels of intensity (40%, 60%, 80%, and 100%). The task was administered in 373 healthy participants aged 8–75. In children aged 8–17, only small developmental effects were found for the emotions anger and happiness, in contrast to adults who showed age‐related decline on anger, fear, happiness, and sadness. Sex differences were present predominantly in the adult participants. IQ only minimally affected the perception of disgust in the children, while years of education were correlated with all emotions but surprise and disgust in the adult participants. A regression‐based approach was adopted to present age‐ and education‐ or IQ‐adjusted normative data for use in clinical practice. Previous studies using the ERT have demonstrated selective impairments on specific emotions in a variety of psychiatric, neurologic, or neurodegenerative patient groups, making the ERT a valuable addition to existing paradigms for the assessment of emotion perception.

18.
Some theories of emotion emphasise a close relationship between interoception and subjective experiences of emotion. In this study, we used facial expressions to examine whether interoceptive sensibility modulated emotional experience in a social context. Interoceptive sensibility was measured using the heartbeat detection task. To estimate individual emotional sensitivity, we made morphed photos that ranged between a neutral and an emotional facial expression (i.e., anger, sadness, disgust and happiness). Recognition rates of particular emotions from these photos were calculated and considered as emotional sensitivity thresholds. Our results indicate that participants with accurate interoceptive awareness are sensitive to the emotions of others, especially for expressions of sadness and happiness. We also found that false responses to sad faces were closely related with an individual's degree of social anxiety. These results suggest that interoceptive awareness modulates the intensity of the subjective experience of emotion and affects individual traits related to emotion processing.

19.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for basic emotions including happiness, anger, fear, sadness, surprise, and disgust. 30 pictures (5 for each emotion) were displayed to 96 participants to assess recognition accuracy. The results showed that recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information congruent and not congruent with a facial expression was displayed before presenting pictures of facial expressions. The results of the second experiment showed that congruent information improved facial expression recognition, whereas incongruent information impaired such recognition.

20.
Deficits in facial emotion recognition occur frequently after stroke, with adverse social and behavioural consequences. The aim of this study was to investigate the neural underpinnings of the recognition of emotional expressions, in particular of the distinct basic emotions (anger, disgust, fear, happiness, sadness and surprise). A group of 110 ischaemic stroke patients with lesions in (sub)cortical areas of the cerebrum was included. Emotion recognition was assessed with the Ekman 60 Faces Test of the FEEST. Patient data were compared to data of 162 matched healthy controls (HCs). For the patients, whole-brain voxel-based lesion–symptom mapping (VLSM) on 3-Tesla MRI images was performed. Results showed that patients performed significantly worse than HCs on both overall recognition of emotions, and specifically of disgust, fear, sadness and surprise. VLSM showed significant lesion–symptom associations for FEEST total in the right fronto-temporal region. Additionally, VLSM for the distinct emotions showed, apart from overlapping brain regions (insula, putamen and Rolandic operculum), also regions related to specific emotions. These were: middle and superior temporal gyrus (anger); caudate nucleus (disgust); superior corona radiata white matter tract, superior longitudinal fasciculus and middle frontal gyrus (happiness); and inferior frontal gyrus (sadness). Our findings help in understanding how lesions in specific brain regions can selectively affect the recognition of the basic emotions.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号