Similar Articles
Twenty similar articles were found.
1.
Introduction: Psychopaths in whom reduced interpersonal and affective abilities dominate are characterized by hypofunction of the right hemisphere, whereas psychopaths in whom impulsivity and antisocial behavior dominate are characterized by hyperfunction of the left hemisphere. The assumption is that this interhemispheric imbalance in psychopaths will also be reflected in the recognition of facial emotional expressions. Objective: To examine the lateralization of facial expressions of positive and negative emotions, as well as the processing of facial expressions of emotions, in criminal and non-criminal psychopaths. Participants: 48 male participants aged 24–40 were voluntarily recruited from the psychiatric hospital in Niš, Serbia. Stimuli: 48 black-and-white photographs were used in two separate tasks, with central and lateral exposure. Results: Reduced recognition of the facial expression of surprise is related to criminality and not necessarily to psychopathy, whereas reduced recognition of the facial expression of fear is related to psychopathy but not to criminality. The valence-specific hypothesis was not confirmed for positive and negative emotions in criminal psychopaths, non-criminal psychopaths, or non-psychopaths; instead, positive emotions were processed equally well in both hemispheres, whereas negative emotions were processed more successfully in the left hemisphere.

2.
Very few large-scale studies have focused on emotional facial expression recognition (FER) in 3-year-olds, an age of rapid social and language development. We studied FER in 808 healthy 3-year-olds using verbal and nonverbal computerized tasks for four basic emotions (happiness, sadness, anger, and fear). Three-year-olds showed differential performance on the verbal and nonverbal FER tasks, especially with respect to fear: fear was among the most accurately recognized facial expressions when matched nonverbally and the least accurately recognized when labeled verbally. Sex influenced neither emotion-matching nor emotion-labeling performance after adjusting for basic matching or labeling ability. Three-year-olds made systematic errors in emotion-labeling: happy expressions were often confused with fearful expressions, whereas negative expressions were often confused with other negative expressions. Together, these findings suggest that 3-year-olds' FER skills strongly depend on task specifications, with fear being the most sensitive facial expression in this regard. Finally, in line with previous studies, we found that recognized emotion categories are initially broad, including emotions of the same valence, as reflected in the nonrandom errors of 3-year-olds.

3.

Experiential avoidance refers to attempts to control or suppress unwanted thoughts, feelings, and emotions. We investigated whether experiential avoidance is associated with fewer facial expressions during autobiographical retrieval (i.e. retrieval of memories of personal information). We invited participants to retrieve autobiographical memories, and recall was analysed with facial-analysis software that detects and classifies emotional expressions. Participants were divided into high and low experiential-avoidance groups. Analysis showed fewer emotional facial expressions during autobiographical retrieval in participants with high experiential avoidance than in those with low experiential avoidance. This reduced expressiveness can be regarded as an attempt by individuals with high experiential avoidance to avoid communicating their emotional load to others, and also as an attempt to control or suppress the internal events that contribute to the appearance or persistence of unwanted emotional states during retrieval.

4.
The authors examined the association between psychopathy and identification of facial expressions of emotion. Previous research in this area is scant and has produced contradictory findings (Blair et al., 2001, 2004; Glass & Newman, 2006; Kosson et al., 2002). One hundred and forty-five male jail inmates, rated using the Hare Psychopathy Checklist: Screening Version, participated in a facial affect recognition task. Participants were shown faces displaying one of five emotions (happiness, sadness, fear, anger, or shame) at one of two levels of intensity of expression (100% or 60%). The authors predicted that psychopathy would be associated with decreased affect recognition, particularly for sad and fearful expressions, and with decreased recognition of less intense displays of facial affect. Results were largely consistent with expectations: psychopathy was negatively correlated with overall facial affect recognition, recognition of sad facial affect, and recognition of less intense displays of affect. An unexpected negative correlation with recognition of happy facial affect was also found. These results suggest that psychopathy may be associated with a general deficit in affect recognition.

5.
Emotions can be assessed by means of different diagnostic methods, for example self-report instruments, ratings of facial expressions, or projective techniques. This study presents an alternative approach: a computerized investigation of verbally expressed emotions by means of the Affective Dictionary Ulm (ADU; Dahl, Hölzer, & Berry, 1992), applied to responses on the Holtzman Inkblot Technique (HIT; Holtzman, 1988; Holtzman, Thorpe, Swartz, & Herron, 1961). A normal group (n = 30), patients with neurotic disorders (n = 30), borderline patients (n = 30), acute schizophrenics (n = 25), and chronic schizophrenics (n = 25) were compared with regard to verbally expressed emotions. According to the results, patients with neurotic disorders did not differ from the normal group. Borderline patients expressed fear, and emotions in general, significantly more frequently than all other diagnostic groups, and differed with regard to specific emotions from patients with neurotic disorders, acute schizophrenics, and chronic schizophrenics. Acute schizophrenics did not differ from the normal group in the expression of emotions, whereas chronic schizophrenics expressed anger, fear, anxiety, and emotions in general significantly less frequently than normals. A discriminant analysis using verbally expressed emotions as predictors of diagnosis achieved hit rates between 87% and 100%. Furthermore, hypotheses about correlations between emotions on the one hand and internalized primitive object relations and bizarre-idiosyncratic thinking on the other were tested empirically, and significant correlations were demonstrated. These results support the validity of assessing emotions through a lexical content analysis of the HIT using the ADU.
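
As an illustration of the kind of discriminant analysis reported above, the minimal sketch below fits a linear discriminant classifier to simulated emotion-frequency data and reports an in-sample hit rate. The group sizes, the number of emotion categories, and the Poisson-generated counts are illustrative assumptions only; they are not the ADU categories or the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score

# Simulated data (assumption): per-participant frequencies of verbally
# expressed emotion categories, with the diagnostic group as the label.
rng = np.random.default_rng(1)
n_per_group, n_emotions = 30, 8
groups = ["normal", "neurotic", "borderline", "acute_scz", "chronic_scz"]
X = np.vstack([rng.poisson(lam=3 + i, size=(n_per_group, n_emotions))
               for i, _ in enumerate(groups)])
y = np.repeat(groups, n_per_group)

# Discriminant analysis: predict diagnosis from emotion frequencies.
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
hit_rate = accuracy_score(y, lda.predict(X))   # in-sample "hit rate"
print(f"hit rate: {hit_rate:.0%}")
```

Cross-validated hit rates would be a stricter test than the in-sample figure computed here.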

6.
胡治国, 刘宏艳. 《心理科学》 (Journal of Psychological Science), 2015, (5): 1087–1094.
Accurately recognizing facial expressions is important for successful social interaction, and facial expression recognition is influenced by emotional context. This review first describes the facilitating effects of emotional context on facial expression recognition, chiefly the emotional-congruency effect within the visual channel and cross-channel emotional integration effects. It then describes the interfering effects of emotional context, chiefly emotional-conflict and semantic-interference effects. Next, it covers the influence of emotional context on the recognition of neutral and ambiguous faces, chiefly context-induced emotion effects and subliminal affective priming effects. Finally, it summarizes and analyses existing research and offers suggestions for future work.

7.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for the basic emotions of happiness, anger, fear, sadness, surprise, and disgust: thirty pictures (five for each emotion) were displayed to 96 participants, and recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information either congruent or incongruent with a facial expression was displayed before presenting pictures of facial expressions. Congruent information improved facial expression recognition, whereas incongruent information impaired it.

8.
This article reviews recent research on impairments in the processing of emotional facial expressions in schizophrenia and discusses the nature of this impairment and the explanations offered for it, for example whether it is a general or a specific deficit and how it relates to clinical symptoms and cognitive characteristics. Comparative analysis suggests that the impaired perception of emotional facial expressions in schizophrenia may combine features of a facial-information-processing deficit and a difficulty in perceiving emotional information. In addition, the article introduces international research on rehabilitation training for facial expression recognition and identification in schizophrenia, as well as recent studies of the underlying neurophysiological mechanisms using cognitive-neuroscience techniques such as event-related potentials (ERPs) and functional magnetic resonance imaging (fMRI).

9.
To study different aspects of facial emotion recognition, valid methods are needed, and the more widespread methods have some limitations. We propose a more ecological method that consists of presenting dynamic faces and measuring verbal reaction times. We presented 120 video clips depicting a gradual change from a neutral expression to a basic emotion (anger, disgust, fear, happiness, sadness, and surprise), and recorded hit rates and reaction times for the verbal labelling of emotions. Verbal responses to the six basic emotions differed in hit rates and reaction times, in the order happiness > surprise > disgust > anger > sadness > fear (responses earlier in this order were more accurate and faster). Generally, our data are in accordance with previous findings, but they differentiate responses to the six basic emotions better than previous experiments have.

10.

Previous research has found that individuals vary greatly in emotion differentiation, that is, the extent to which they distinguish between different emotions when reporting on their own feelings. Building on previous work showing that emotion differentiation is associated with individual differences in intrapersonal functioning, the current study asks whether emotion differentiation is also related to interpersonal skills. Specifically, we examined whether individuals who are high in emotion differentiation are more accurate in recognising others' emotional expressions. We report two studies in which we used an established paradigm tapping negative emotion differentiation and several emotion recognition tasks. In Study 1 (N = 363), we found that individuals high in emotion differentiation were more accurate in recognising others' emotional facial expressions. Study 2 (N = 217) replicated this finding using emotion recognition tasks with varying amounts of emotional information. These findings suggest that the knowledge we use to understand our own emotional experience also helps us understand the emotions of others.
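
For readers unfamiliar with how emotion differentiation is commonly operationalized, the sketch below computes a simplified proxy under the assumption that a person rates several negative emotions at many sampled moments: the more strongly those ratings co-vary, the less the person differentiates between emotions. The index, the simulated ratings, and the numbers of moments and emotions are all illustrative assumptions, not necessarily the paradigm used in the study above.

```python
import numpy as np

def differentiation_index(ratings):
    """ratings: (n_moments, n_emotions) array of self-reported intensities.
    Returns a proxy index where higher values mean more differentiation."""
    corr = np.corrcoef(ratings, rowvar=False)        # emotion-by-emotion correlations
    upper = corr[np.triu_indices_from(corr, k=1)]    # unique pairwise correlations
    return 1.0 - float(np.mean(upper))               # strong co-variation -> low index

# Simulated raters (assumption): one whose negative emotions track a single
# shared factor (low differentiation) and one whose emotions vary independently.
rng = np.random.default_rng(2)
shared = rng.normal(size=(60, 1))                    # 60 sampled moments
low_diff = shared + 0.2 * rng.normal(size=(60, 4))   # 4 emotions driven by one factor
high_diff = rng.normal(size=(60, 4))                 # 4 largely independent emotions
print(f"low differentiator:  {differentiation_index(low_diff):.2f}")
print(f"high differentiator: {differentiation_index(high_diff):.2f}")
```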

11.
Recognizing emotion in faces: developmental effects of child abuse and neglect
The relative contributions to the recognition of emotional signals of (a) experience and learning versus (b) internal predispositions are difficult to investigate because children are virtually always exposed to complex emotional experiences from birth. The recognition of emotion among physically abused and physically neglected preschoolers was assessed in order to examine the effects of atypical experience on emotional development. In Experiment 1, children matched a facial expression to an emotional situation. Neglected children had more difficulty discriminating emotional expressions than did control or physically abused children, and physically abused children displayed a response bias for angry facial expressions. In Experiment 2, children rated the similarity of facial expressions. Control children viewed discrete emotions as dissimilar, neglected children saw fewer distinctions between emotions, and physically abused children showed the most variance across emotions. These results suggest that to the extent that children's experience with the world varies, so too will their interpretation and understanding of emotional signals.

12.
Reading the non-verbal cues from faces to infer the emotional states of others is central to our daily social interactions from very early in life. Despite the relatively well-documented ontogeny of facial expression recognition in infancy, our understanding of the development of this critical social skill throughout childhood into adulthood remains limited. To this end, using a psychophysical approach we implemented the QUEST threshold-seeking algorithm to parametrically manipulate the quantity of signal available in faces normalized for contrast and luminance, displaying the six emotional expressions plus neutral. We thus determined observers' perceptual thresholds for effective discrimination of each emotional expression from 5 years of age up to adulthood. Consistent with previous studies, happiness was most easily recognized, requiring minimum signal (35% on average), whereas fear required the maximum signal (97% on average) across groups. Overall, recognition improved with age for all expressions except happiness and fear, for which all age groups, including the youngest, remained within the adult range. Uniquely, our findings sort the recognition trajectories of the six basic emotions into three distinct groupings: expressions that show a steep improvement with age (disgust, neutral, and anger); expressions that show a more gradual improvement with age (sadness, surprise); and those that remain stable from early childhood (happiness and fear), indicating that the coding for these expressions is already mature by 5 years of age. Altogether, our data provide for the first time a fine-grained mapping of the development of facial expression recognition. This approach significantly increases our understanding of the decoding of emotions across development and offers a novel tool to measure impairments for specific facial expressions in developmental clinical populations.
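
The QUEST threshold-seeking algorithm mentioned above is a Bayesian adaptive staircase. The minimal sketch below maintains a posterior over the observer's threshold on a log-intensity scale, updates it after every trial using a Weibull psychometric function, and places the next stimulus at the posterior mean. The Weibull parameters, guess rate, prior, trial count, and simulated observer are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def weibull_p_correct(x, threshold, beta=3.5, gamma=0.5, delta=0.01):
    """Probability of a correct response at log-intensity x given a threshold."""
    p = 1 - (1 - gamma) * np.exp(-10 ** (beta * (x - threshold)))
    return delta * gamma + (1 - delta) * p

class Quest:
    def __init__(self, prior_mean=0.0, prior_sd=1.0):
        self.grid = np.linspace(-2.5, 2.5, 501)            # candidate thresholds (log units)
        self.posterior = np.exp(-0.5 * ((self.grid - prior_mean) / prior_sd) ** 2)
        self.posterior /= self.posterior.sum()

    def next_intensity(self):
        # Test at the current posterior mean of the threshold.
        return float(np.sum(self.grid * self.posterior))

    def update(self, intensity, correct):
        # Bayesian update of the threshold posterior after one trial.
        likelihood = weibull_p_correct(intensity, self.grid)
        if not correct:
            likelihood = 1 - likelihood
        self.posterior *= likelihood
        self.posterior /= self.posterior.sum()

# Simulated observer (assumption): true threshold of 0.3 log units, 40 trials.
rng = np.random.default_rng(0)
true_threshold = 0.3
q = Quest()
for _ in range(40):
    x = q.next_intensity()
    correct = rng.random() < weibull_p_correct(x, true_threshold)
    q.update(x, correct)
print(f"estimated threshold: {q.next_intensity():.2f}")
```

With this parameterization the threshold corresponds to roughly 82% correct, the conventional QUEST target for two-alternative tasks; a study discriminating among more expressions would use a different guess rate.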

13.
Background: Difficulties with social function have been reported in chronic fatigue syndrome (CFS), but underpinning factors are unknown. Emotion recognition, theory of mind (inference of another's mental state) and ‘emotional’ theory of mind (eToM) (inference of another's emotional state) are important social abilities, facilitating understanding of others. This study examined emotion recognition and eToM in CFS patients and their relationship to self-reported social function.

Methods: CFS patients (n = 45) and healthy controls (HCs; n = 50) completed tasks assessing emotion recognition, basic or advanced eToM (for self and other), and a self-report measure of social function.

Results: CFS participants were poorer than HCs at recognising emotion states in the faces of others and at inferring their own emotions. Lower scores on these tasks were associated with poorer self-reported daily and social function. CFS patients demonstrated good eToM and performance on these tasks did not relate to the level of social function.

Conclusions: CFS patients do not have poor eToM, nor does eToM appear to be associated with social functioning in CFS. However, these patients experience difficulties in recognising emotions and in inferring their own emotions, and this may impact upon social function.

14.
The most familiar emotional signals consist of faces, voices, and whole-body expressions, but so far research on emotions expressed by the whole body is sparse. The authors investigated recognition of whole-body expressions of emotion in three experiments. In the first experiment, participants performed a body-expression-matching task; results indicate good recognition of all emotions, with fear being the hardest to recognize. In the second experiment, two-alternative forced-choice categorizations of the facial expression of a compound face-body stimulus were strongly influenced by the bodily expression, and this effect was a function of the ambiguity of the facial expression. In the third experiment, recognition of emotional tone of voice was similarly influenced by task-irrelevant emotional body expressions. Taken together, the findings illustrate the importance of emotional whole-body expressions in communication, whether viewed on their own or, as is often the case in realistic circumstances, in combination with facial expressions and emotional voices.

15.
Psychological factors are known to play an important part in the origin of many medical conditions, including hypertension. Recent studies have reported that elevated blood pressure (even within the normal range of variation) is associated with reduced responsiveness to emotions, or ‘emotional dampening’. Our aim was to assess emotional dampening in individuals with more extreme blood pressure levels, including prehypertensives (N = 58) and hypertensives (N = 60), by comparing their emotion recognition ability with that of normotensives (N = 57). Participants completed novel facial emotion matching and facial emotion labelling tasks following blood pressure measurement, and their accuracy of emotion recognition and average response times were compared. The normotensives demonstrated significantly higher accuracy than the prehypertensives and the hypertensives in labelling facial emotions, and this difference generalised to the task in which two facial halves (upper and lower) had to be matched on the basis of emotion. The groups did not differ in speed of emotion processing in either the labelling or the matching condition. The findings extend reports of ‘emotional dampening’ to hypertensives as well as to those at risk of developing hypertension (i.e. prehypertensives) and have important implications for understanding the psychological component of medical conditions such as hypertension.

16.
17.
Background: Facial expressions, prosody, and speech content are channels through which emotional information is exchanged. Little is known about their simultaneous and differential contributions to empathy when they convey emotionality or neutrality; neutralised speech content in particular has received little attention with regard to how it influences the perception of other emotional cues. Methods: Participants were presented with video clips of actors telling short stories. One condition conveyed emotionality in all channels, while the other conditions neutralised one channel each: speech content, facial expression, or prosody. Participants judged the emotion and intensity presented, as well as their own emotional state and its intensity. Skin conductance served as a physiological measure of emotional reactivity. Results: Neutralising a channel significantly reduced empathic responses, and electrodermal recordings confirmed these findings. The channels contributed differentially to the prerequisites of empathy: recognition of the target's emotion decreased most when the face was neutral, whereas reduced emotional responses attributed to the target emotion were most evident with neutral speech content. Conclusion: Multichannel integration supports both conscious and autonomic measures of empathy and emotional reactivity. Emotional facial expressions influence emotion recognition, whereas speech content is important for responding with an adequate emotional state of one's own, possibly reflecting contextual emotion appraisal.

18.
Amnesic patients can re-experience emotions elicited by forgotten events, suggesting that brain systems for episodic and emotional memory are independent. However, the range of such emotional memories remains under-investigated (most studies employing just positive–negative emotion dyads), and executive function may also play a role in the re-experience of emotions. This is the first investigation of the intensity of the emotional re-experience of a range of discrete emotions (anger, fear, sadness, and happiness) for a group of amnesic patients. Twenty Korsakoff syndrome (KS) patients and 20 neurologically normal controls listened to four novel emotional vignettes selectively eliciting the four basic emotions. Emotional experience was measured using pen-and-paper Visual Analogue Mood Scales and episodic memory using verbal recollections. After 30 min, the recollection of stories was severely impaired for the patient group, but the emotional re-experience was no different from that of controls. Notably, there was no relationship between episodic recall and the intensity of the four emotions, such that even profoundly amnesic patients reported moderate levels of the target emotion. Exploratory analyses revealed negative correlations between the intensity of basic emotions and executive functions (e.g., cognitive flexibility and response inhibition) for controls but not patients. The results suggest that discrete emotions can be re-experienced independently of episodic memory, and that the re-experience of certain discrete emotions appears to be dampened by executive control. KS patients with absent or mild cognitive symptoms should benefit from emotion-regulation interventions aimed at reducing the recognized affective burden associated with their episodic memory deficit.

19.
20.
Recognition of facial affect in Borderline Personality Disorder
Patients with Borderline Personality Disorder (BPD) have been described as emotionally hyperresponsive, especially to anger and fear in social contexts. The aim was to investigate whether BPD patients are more sensitive but less accurate in basic emotion recognition, and whether they show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions. Twenty-five women with BPD were compared with healthy controls on two different facial emotion recognition tasks. The first task allowed the assessment of the subjective detection threshold as well as the number of evaluation errors for six basic emotions; the second task assessed a response bias to blends of basic emotions. BPD patients showed no general deficit on the affect recognition task, but did show enhanced learning over the course of the experiment. For ambiguous emotional stimuli, we found a bias towards the perception of anger in the BPD patients but not towards fear. BPD patients are accurate in perceiving facial emotions, and are probably more sensitive to familiar facial expressions. They show a bias towards perceiving anger when socio-affective cues are ambiguous. Interpersonal training should focus on the differentiation of ambiguous emotion in order to reduce a biased appraisal of others.
