Similar Articles
1.
Recognition of facial affect in Borderline Personality Disorder
Patients with Borderline Personality Disorder (BPD) have been described as emotionally hyperresponsive, especially to anger and fear in social contexts. The aim was to investigate whether BPD patients are more sensitive but less accurate in terms of basic emotion recognition, and show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions. Twenty-five women with BPD were compared with healthy controls on two different facial emotion recognition tasks. The first task allowed the assessment of the subjective detection threshold as well as the number of evaluation errors on six basic emotions. The second task assessed a response bias to blends of basic emotions. BPD patients showed no general deficit on the affect recognition task, but did show enhanced learning over the course of the experiment. For ambiguous emotional stimuli, we found a bias towards the perception of anger in the BPD patients but not towards fear. BPD patients are accurate in perceiving facial emotions, and are probably more sensitive to familiar facial expressions. They show a bias towards perceiving anger when socio-affective cues are ambiguous. Interpersonal training should focus on the differentiation of ambiguous emotion in order to reduce a biased appraisal of others.

2.
A growing body of evidence from humans and other animals suggests the amygdala may be a critical neural substrate for emotional processing. In particular, recent studies have shown that damage to the human amygdala impairs the normal appraisal of social signals of emotion, primarily those of fear. However, effective social communication depends on both the ability to receive (emotional appraisal) and the ability to send (emotional expression) signals of emotional state. Although the role of the amygdala in the appraisal of emotion is well established, its importance for the production of emotional expressions is unknown. We report a case study of a patient with bilateral amygdaloid damage who, despite a severe deficit in interpreting facial expressions of emotion including fear, exhibits an intact ability to express this and other basic emotions. This dissociation suggests that a single neural module does not support all aspects of the social communication of emotional state.

3.
We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness–fear and anger–disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness–fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise–sadness and excitement–disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.

4.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for basic emotions including happiness, anger, fear, sadness, surprise, and disgust. 30 pictures (5 for each emotion) were displayed to 96 participants to assess recognition accuracy. The results showed that recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information congruent and not congruent with a facial expression was displayed before presenting pictures of facial expressions. The results of the second experiment showed that congruent information improved facial expression recognition, whereas incongruent information impaired such recognition.

5.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed.

6.
This study was designed to test three competing hypotheses (impaired configural processing; impaired Theory of Mind; atypical amygdala functioning) to explain the basic facial expression recognition profile of adults with autism spectrum disorders (ASD). In Experiment 1 the Ekman and Friesen (1976) series were presented upright and inverted. Individuals with ASD were significantly less accurate than controls at recognising upright facial expressions of fear, sadness and disgust, and their pattern of errors suggested some configural processing difficulties. Impaired recognition of inverted facial expressions suggested some additional difficulties processing the facial features. Unexpectedly, the clinical group misidentified fear as anger. In Experiment 2 feature processing of facial expressions was investigated by presenting stimuli in a piecemeal fashion, starting with either just the eyes or the mouth. Individuals with ASD were impaired at recognising fear from the eyes and disgust from the mouth; they also confused fearful eyes as being angry. The findings are discussed in terms of the three competing hypotheses tested.

7.
Arousal and valence have long been studied as the two primary dimensions for the perception of emotional stimuli such as facial expressions. Prior correlational studies that tested emotion perception along these dimensions found broad similarities between adults and children. However, few studies looked for direct differences between children and adults in these dimensions beyond correlation. We tested 9-year-old children and adults on rating positive and negative facial stimuli based on emotional arousal and valence. Despite high significant correlations between children’s and adults’ ratings, our findings also showed significant differences between children and adults in terms of rating values: Children rated all expressions as significantly more positive than adults in valence. Children also rated positive emotions as more arousing than adults. Our results show that although perception of facial emotions along arousal and valence follows similar patterns in children and adults, some differences in ratings persist, and vary by emotion type.

8.
Basic emotion theory is one of the most representative theories in the science of emotion. It holds that human emotion is composed of a limited number of basic emotions, such as fear, anger, joy, and sadness. Basic emotions evolved to accomplish fundamental life tasks, and each basic emotion is held to have its own distinct neural structures and physiological basis. Although basic emotion theory is widely accepted, there is no consensus on which emotions count as basic. In recent decades, many fMRI studies have attempted to identify the distinct neural substrates of the various basic emotions and have produced important findings: disgust has been linked to the insula, sadness to the anterior cingulate cortex, and the amygdala has been identified as a key limbic structure for fear. Recently, however, meta-analyses have found that many basic emotions share overlapping brain regions, calling the emotion-specific brain-region account into question and even challenging basic emotion theory itself. By examining basic emotions and their neural bases, and by reviewing the latest functional magnetic resonance imaging research on basic emotion theory, this article proposes that the controversy stems from how the categories of basic emotions are determined, because many supposedly distinct basic emotions are in fact the same basic emotion; it suggests that humans may have only three basic emotions. Future research could use machine vision techniques to further advance brain imaging studies of basic emotions.

9.
Detection of emotional facial expressions has been shown to be more efficient than detection of neutral expressions. However, it remains unclear whether this effect is attributable to visual or emotional factors. To investigate this issue, we conducted two experiments using the visual search paradigm with photographic stimuli. We included a single target facial expression of anger or happiness in presentations of crowds of neutral facial expressions. The anti-expressions of anger and happiness were also presented. Although anti-expressions produced changes in visual features comparable to those of the emotional facial expressions, they expressed relatively neutral emotions. The results consistently showed that reaction times (RTs) for detecting emotional facial expressions (both anger and happiness) were shorter than those for detecting anti-expressions. The RTs for detecting the expressions were negatively related to experienced emotional arousal. These results suggest that efficient detection of emotional facial expressions is not attributable to their visual characteristics but rather to their emotional significance.

10.
Very few large-scale studies have focused on emotional facial expression recognition (FER) in 3-year-olds, an age of rapid social and language development. We studied FER in 808 healthy 3-year-olds using verbal and nonverbal computerized tasks for four basic emotions (happiness, sadness, anger, and fear). Three-year-olds showed differential performance on the verbal and nonverbal FER tasks, especially with respect to fear. That is to say, fear was one of the most accurately recognized facial expressions as matched nonverbally and the least accurately recognized facial expression as labeled verbally. Sex influenced neither emotion-matching nor emotion-labeling performance after adjusting for basic matching or labeling ability. Three-year-olds made systematic errors in emotion-labeling. Namely, happy expressions were often confused with fearful expressions, whereas negative expressions were often confused with other negative expressions. Together, these findings suggest that 3-year-olds' FER skills strongly depend on task specifications. Importantly, fear was the most sensitive facial expression in this regard. Finally, in line with previous studies, we found that recognized emotion categories are initially broad, including emotions of the same valence, as reflected in the nonrandom errors of 3-year-olds.

11.
The purpose of this study was to compare the recognition performance of children who identified facial expressions of emotions using adults' and children's stimuli. The subjects were 60 children equally distributed in six subgroups as a function of sex and three age levels: 5, 7, and 9 years. They had to identify the emotion that was expressed in 48 stimuli (24 adults' and 24 children's expressions) illustrating six emotions: happiness, surprise, fear, disgust, anger, and sadness. The task of the children consisted of selecting the facial stimulus that best matched a short story that clearly described an emotional situation. The results indicated that recognition performances were significantly affected by the age of the subjects: 5-year-olds were less accurate than 7- and 9-year-olds, who did not differ among themselves. There were also differences in recognition levels between emotions. No effects related to the sex of the subjects and to the age of the facial stimuli were observed.

12.
Facial expressions of anger and fear have been seen to elicit avoidance behavior in the perceiver due to their negative valence. However, recent research uncovered discrepancies regarding these immediate motivational implications of fear and anger, suggesting that not all negative emotions trigger avoidance to a comparable extent. To clarify those discrepancies, we considered recent theoretical and methodological advances, and investigated the role of social preferences and processing focus on approach-avoidance tendencies (AAT) to negative facial expressions. We exposed participants to dynamic facial expressions of anger, disgust, fear, or sadness, while they processed either the emotional expression or the gender of the faces. AATs were assessed by reaction times of lever movements, and by posture changes via head-tracking. We found that, relative to angry faces, fearful and sad faces triggered more approach, with a larger difference between fear and anger in prosocial compared to individualistic participants. Interestingly, these findings are in line with a recently developed concern hypothesis, suggesting that, relative to other negative expressions, expressions of distress may facilitate approach, especially in participants with prosocial preferences.

13.
Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions.

14.
Emotion can be conceptualized by the dimensional account of emotion with the dimensions of valence and arousal. There is little discussion of the difference in discriminability across the dimensions. The present study hypothesized that any pair of emotional expressions differing in the polarity of both valence and arousal dimensions would be easier to distinguish than a pair differing in only one dimension. The results indicate that the difference in the dimensions did not affect participants’ reaction time. Most pairs of emotional expressions, except those involving fear, were similarly discriminative. Reaction times to pairs with a fearful expression were faster than to those without. The fast reaction time to fearful facial expressions underscores the survival value of emotions.

15.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication.

16.
The effects of focal brain lesions on the decoding of emotional concepts in facial expressions were investigated. Facial emotions are hierarchically organized patterns comprising (1) structural surface features, (2) discrete (primary) emotional categories and (3) secondary dimensions, such as valence and arousal. Categorical decoding was measured using (1) selection of category labels and selection of the named emotion category; (2) matching one facial expression with two choice expressions. Dimensional decoding was assessed by matching one face with two different expressions with regard to valence or arousal. 70 patients with well-documented cerebral lesions and 15 matched hospital controls participated in the study. 27 had left brain damage (LBD; 10 frontal, 10 temporal, 7 parietal); 37 had right brain damage (RBD; 15 frontal, 11 temporal, 11 parietal). Six additional patients had lesions involving both frontal lobes. Right temporal and parietal lesioned patients were markedly impaired in the decoding of primary emotions. The same patients also showed reduced arousal decoding. In contrast to several patients with frontal and left hemisphere lesions, emotional conceptualization and face discrimination were not independent in these groups. No group differences were observed in valence decoding. However, right frontal lesions appeared to interfere with the discrimination of negative valence. Moreover, a distraction by structural features was noted in RBD patients when facial identities were varied across stimulus and response pictures in matching tasks with differing conceptual load. Our results suggest that focal brain lesions differentially affect the comprehension of emotional meaning in faces depending on the level of conceptual load and interference of structural surface features.

17.
Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). Also, FTLD patients show impairments in emotion processing. Specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which thus may have been a confounding factor in previous studies. Also, ceiling effects are often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at the recognition of the emotion anger. Also, the patients performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

18.
This investigation examined whether impairment in configural processing could explain deficits in face emotion recognition in people with Parkinson’s disease (PD). Stimuli from the Radboud Faces Database were used to compare recognition of four negative emotion expressions by older adults with PD (n = 16) and matched controls (n = 17). Participants were tasked with categorizing emotional expressions from upright and inverted whole faces and facial composites; it is difficult to derive configural information from these two types of stimuli so featural processing should play a larger than usual role in accurate recognition of emotional expressions. We found that the PD group were impaired relative to controls in recognizing anger, disgust and fearful expressions in upright faces. Then, consistent with a configural processing deficit, participants with PD showed no composite effect when attempting to identify facial expressions of anger, disgust and fear. A face inversion effect, however, was observed in the performance of all participants in both the whole faces and facial composites tasks. These findings can be explained in terms of a configural processing deficit if it is assumed that the disruption caused by facial composites was specific to configural processing, whereas inversion reduced performance by making it difficult to derive both featural and configural information from faces.

19.
It is well known that patients who have sustained frontal-lobe traumatic brain injury (TBI) are severely impaired on tests of emotion recognition. Indeed, these patients have significant difficulty recognizing facial expressions of emotion, and such deficits are often associated with decreased social functioning and poor quality of life. To date, no studies have examined the response patterns which underlie facial emotion recognition impairment in TBI and which may lend clarity to the interpretation of deficits. Therefore, the present study aimed to characterize response patterns in facial emotion recognition in 14 patients with frontal TBI compared to 22 matched control subjects, using a task which required participants to rate the intensity of each emotion (happiness, sadness, anger, disgust, surprise and fear) of a series of photographs of emotional and neutral faces. Results first confirmed the presence of facial emotion recognition impairment in TBI, and further revealed that patients displayed a liberal bias when rating facial expressions, leading them to associate intense ratings of incorrect emotional labels to sad, disgusted, surprised and fearful facial expressions. These findings are generally in line with prior studies which also report important facial affect recognition deficits in TBI patients, particularly for negative emotions.

20.
In order to investigate the role of facial movement in the recognition of emotions, faces were covered with black makeup and white spots. Video recordings of such faces were played back so that only the white spots were visible. The results demonstrated that moving displays of happiness, sadness, fear, surprise, anger and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicated that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of these six emotions was also investigated using normally illuminated and spots-only displays. In both instances the results indicated that different facial regions are more informative for different emotions. The movement patterns characterizing the various emotional expressions as well as common confusions between emotions are also discussed.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号