Similar Articles
20 similar articles found.
1.
Deficits in facial emotion recognition occur frequently after stroke, with adverse social and behavioural consequences. The aim of this study was to investigate the neural underpinnings of the recognition of emotional expressions, in particular of the distinct basic emotions (anger, disgust, fear, happiness, sadness and surprise). A group of 110 ischaemic stroke patients with lesions in (sub)cortical areas of the cerebrum was included. Emotion recognition was assessed with the Ekman 60 Faces Test of the FEEST. Patient data were compared to data of 162 matched healthy controls (HCs). For the patients, whole-brain voxel-based lesion–symptom mapping (VLSM) on 3-Tesla MRI images was performed. Results showed that patients performed significantly worse than HCs both on overall recognition of emotions and specifically on disgust, fear, sadness and surprise. VLSM showed significant lesion–symptom associations for FEEST total in the right fronto-temporal region. Additionally, VLSM for the distinct emotions showed, apart from overlapping brain regions (insula, putamen and Rolandic operculum), also regions related to specific emotions. These were: middle and superior temporal gyrus (anger); caudate nucleus (disgust); superior corona radiata white-matter tract, superior longitudinal fasciculus and middle frontal gyrus (happiness); and inferior frontal gyrus (sadness). Our findings help in understanding how lesions in specific brain regions can selectively affect the recognition of the basic emotions.

2.
There is substantial evidence for facial emotion recognition (FER) deficits in autism spectrum disorder (ASD). The extent of this impairment, however, remains unclear, and there is some suggestion that clinical groups might benefit from the use of dynamic rather than static images. High-functioning individuals with ASD (n = 36) and typically developing controls (n = 36) completed a computerised FER task involving static and dynamic expressions of the six basic emotions. The ASD group showed poorer overall performance in identifying anger and disgust and were disadvantaged by dynamic (relative to static) stimuli when presented with sad expressions. Among both groups, however, dynamic stimuli appeared to improve recognition of anger. This research provides further evidence of specific impairment in the recognition of negative emotions in ASD, but argues against any broad advantages associated with the use of dynamic displays.

3.
Influential models highlight the central integration of bodily arousal with emotion. Some emotions, notably disgust, are more closely coupled to visceral state than others. Cardiac baroreceptors, activated at systole within each cardiac cycle, provide short-term visceral feedback. Here we explored how phasic baroreceptor activation may alter the appraisal of brief emotional stimuli and consequent cardiovascular reactions. We used functional MRI (fMRI) to measure brain responses to emotional face stimuli presented before and during cardiac systole. We observed that the processing of emotional stimuli was altered by concurrent natural baroreceptor activation. Specifically, facial expressions of disgust were judged as more intense when presented at systole, and rebound heart rate increases were attenuated after expressions of disgust and happiness. Neural activity within prefrontal cortex correlated with emotionality ratings. Activity within periaqueductal gray matter reflected both emotional ratings and their interaction with cardiac timing. Activity within regions including prefrontal and visual cortices correlated with increases in heart rate evoked by the face stimuli, while orbitofrontal activity reflected both evoked heart rate change and its interaction with cardiac timing. Our findings demonstrate that momentary physiological fluctuations in cardiovascular afferent information (1) influence specific emotional judgments, mediated through regions including the periaqueductal gray matter, and (2) shape evoked autonomic responses through engagement of orbitofrontal cortex. Together these findings highlight the close coupling of visceral and emotional processes and identify neural regions mediating bodily state influences on affective judgment.

4.
Which brain regions are associated with recognition of emotional prosody? Are these distinct from those for recognition of facial expression? These issues were investigated by mapping the overlaps of co-registered lesions from 66 brain-damaged participants as a function of their performance in rating basic emotions. It was found that recognizing emotions from prosody draws on the right frontoparietal operculum, the bilateral frontal pole, and the left frontal operculum. Recognizing emotions from prosody and facial expressions draws on the right frontoparietal cortex, which may be important in reconstructing aspects of the emotion signaled by the stimulus. Furthermore, there were regions in the left and right temporal lobes that contributed disproportionately to recognition of emotion from faces or prosody, respectively.

5.
There is substantial evidence to suggest that deafness is associated with delays in emotion understanding, which has been attributed to delays in language acquisition and opportunities to converse. However, studies addressing the ability to recognise facial expressions of emotion have produced equivocal findings. The two experiments presented here attempt to clarify emotion recognition in deaf children by considering two aspects: the role of motion and the role of intensity in deaf children’s emotion recognition. In Study 1, 26 deaf children were compared to 26 age-matched hearing controls on a computerised facial emotion recognition task involving static and dynamic expressions of 6 emotions. Eighteen of the deaf and 18 age-matched hearing controls additionally took part in Study 2, involving the presentation of the same 6 emotions at varying intensities. Study 1 showed that deaf children’s emotion recognition was better in the dynamic rather than static condition, whereas the hearing children showed no difference in performance between the two conditions. In Study 2, the deaf children performed no differently from the hearing controls, showing improved recognition rates with increasing rates of intensity. With the exception of disgust, no differences in individual emotions were found. These findings highlight the importance of using ecologically valid stimuli to assess emotion recognition.

6.
Memory for actions is usually better following subject-performed tasks (SPT) than verbal tasks (VT). We hypothesised that enactment unitises the components of actions such that familiarity can support associative recognition following SPT. To examine this hypothesis, participants studied verb–object pairs in a SPT or VT condition. During testing, they discriminated between intact, recombined and new items and made Remember/Know judgments; additionally, their EEGs were recorded. Associative recognition was better following SPT than VT. Early frontal event-related potentials (ERPs) were graded according to the item status following SPT, but no such effects were found after VT. Similarly, the late parietal ERPs were graded following SPT, whereas these effects were smaller and did not differ between intact and recombined items following VT. We conclude that enactment unitised the action and object so that familiarity could contribute to associative recognition and that recollection became sensitive to the amount of the matching associative information.

7.
The aim of the present study was to contribute to the literature on the ability to recognize anger, happiness, fear, surprise, sadness, disgust, and neutral expressions from facial information (whole face, eye region, mouth region). More specifically, the aim was to investigate older adults' emotion-recognition performance using the same tool employed in previous studies on children and adults, and to verify whether their pattern of emotion recognition differs from that of the other two groups. Results showed that happiness was among the easiest emotions for older adults to recognize, while disgust was consistently among the most difficult. The findings indicate that emotions are recognized more easily when pictures show the whole face; of the two specific regions (eyes and mouth), older participants recognized emotions more easily when the mouth region was presented. In general, the results of the study did not detect a decline in the ability to recognize emotions from the face, eyes, or mouth. The performance of the older adults was statistically worse than that of the other two groups in only a few cases: anger and disgust recognition from the whole face; anger recognition from the eye region; and disgust, fear, and neutral-expression recognition from the mouth region.

8.
Essentially all behavior is regulated by the brain in response to information received from within the body or from the environment. The tangible structures of the brain serve as devices for processing thoughts and emotions as well as information. Stored among the interacting neural structures are memories of past experiences and responses to them. These intangibles participate in determining the decisions made and the actions performed by the brain's structures. There are valuable studies of the clinical and neurological effects of environmental stimuli, but we need to learn more about the processes that lead to these effects. More definitive correlations could be made between environmental stimuli and the neurological pathways they create by studying individuals' real-life experiences rather than laboratory simulations alone.

9.
To study different aspects of facial emotion recognition, valid methods are needed, and the more widespread methods have limitations. We propose a more ecological method that consists of presenting dynamic faces and measuring verbal reaction times. We presented 120 video clips depicting a gradual change from a neutral expression to a basic emotion (anger, disgust, fear, happiness, sadness and surprise), and recorded hit rates and reaction times of verbal labelling of emotions. Our results showed that verbal responses to the six basic emotions differed in hit rates and reaction times, ordered from most accurate and fastest to least: happiness > surprise > disgust > anger > sadness > fear. Generally, our data are in accordance with previous findings, but they differentiate responses to the six basic emotions more finely than previous experiments did.

10.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed.  相似文献   

11.
This study used event-related potential (ERP) techniques combined with an implicit task paradigm to investigate the processing of disgust across three domains, while also measuring individuals' aggression levels in order to identify individual differences in disgust processing. In the early processing stage (130–190 ms), both physical-disgust and moral-disgust stimuli could be identified; in the middle stage (300–350 ms), the three types of disgust stimuli dissociated from one another, and the three disgust emotions were differentiated; in the late stage (400–600 ms), individuals were most sensitive to sexual-disgust stimuli. Individuals with different aggression levels showed no clear differences in processing disgust stimuli. The results indicate that, at the neural level, disgust under the three-domain structure can indeed be identified and differentiated.

13.
In the present study, we used fMRI to assess patients suffering from post-traumatic stress disorder (PTSD) or depression, and trauma-exposed controls, during an episodic memory retrieval task that included non-trauma-related emotional information. In the study phase of the task neutral pictures were presented in emotional or neutral contexts. Participants were scanned during the test phase, when they were presented with old and new neutral images in a yes/no recognition memory task. fMRI results for the contrast between old and new items revealed activation in a predominantly left-sided network of cortical regions including the left middle temporal, bilateral posterior cingulate, and left prefrontal cortices. Activity common to all three groups when correctly judging pictures encoded in emotional contexts was much more limited. Relative to the control and depressed groups the PTSD group exhibited greater sensitivity to correctly recognised stimuli in the left amygdala/ventral striatum and right occipital cortex, and more specific sensitivity to items encoded in emotional contexts in the right precuneus, left superior frontal gyrus, and bilateral insula. These results are consistent with a substantially intact neural system supporting episodic retrieval in patients suffering from PTSD. Moreover, there was little indication that PTSD is associated with a marked change in the way negatively valenced information, not of personal significance, is processed.

14.
Research on emotion recognition has been dominated by studies of photographs of facial expressions. A full understanding of emotion perception and its neural substrate will require investigations that employ dynamic displays and means of expression other than the face. Our aims were: (i) to develop a set of dynamic and static whole-body expressions of basic emotions for systematic investigations of clinical populations, and for use in functional-imaging studies; (ii) to assess forced-choice emotion-classification performance with these stimuli relative to the results of previous studies; and (iii) to test the hypotheses that more exaggerated whole-body movements would produce (a) more accurate emotion classification and (b) higher ratings of emotional intensity. Ten actors portrayed 5 emotions (anger, disgust, fear, happiness, and sadness) at 3 levels of exaggeration, with their faces covered. Two identical sets of 150 emotion portrayals (full-light and point-light) were created from the same digital footage, along with corresponding static images of the 'peak' of each emotion portrayal. Recognition tasks confirmed previous findings that basic emotions are readily identifiable from body movements, even when static form information is minimised by use of point-light displays, and that full-light and even point-light displays can convey identifiable emotions, though rather less efficiently than dynamic displays. Recognition success differed for individual emotions, corroborating earlier results about the importance of distinguishing differences in movement characteristics for different emotional expressions. The patterns of misclassifications were in keeping with earlier findings on emotional clustering. 
Exaggeration of body movement (a) enhanced recognition accuracy, especially for the dynamic point-light displays, but notably not for sadness, and (b) produced higher emotional-intensity ratings, regardless of lighting condition, for movies but to a lesser extent for stills, indicating that intensity judgments of body gestures rely more on movement (or form-from-movement) than static form information.

15.
The processing of several important aspects of a human face was investigated in a single patient (LZ), who had a large infarct of the right hemisphere involving the parietal and temporal lobes with extensions into the frontal region. LZ showed selective problems with recognizing emotional expressions, whereas she was flawless in recognizing gender, familiarity, and identity. She was very poor at recognizing negative facial expressions (fear, disgust, anger, sadness), but scored as well as the controls on the positive facial expression of happiness. However, in two experiments using both static and dynamic face stimuli, we showed that LZ also did not have a proper notion of what a facial expression of happiness looks like, and could not adequately apply this label. We conclude that the proper recognition of both negative and positive facial expressions relies on the right hemisphere, and that the left hemisphere produces a default state resulting in a bias towards evaluating expressions as happy. We discuss the implications of the current findings for the main models that aim to explain hemispheric specializations for processing of positive and negative emotions.

16.
The purpose of this study was to compare the recognition performance of children who identified facial expressions of emotions using adults' and children's stimuli. The subjects were 60 children equally distributed in six subgroups as a function of sex and three age levels: 5, 7, and 9 years. They had to identify the emotion that was expressed in 48 stimuli (24 adults' and 24 children's expressions) illustrating six emotions: happiness, surprise, fear, disgust, anger, and sadness. The task of the children consisted of selecting the facial stimulus that best matched a short story that clearly described an emotional situation. The results indicated that recognition performances were significantly affected by the age of the subjects: 5-year-olds were less accurate than 7- and 9-year-olds, who did not differ among themselves. There were also differences in recognition levels between emotions. No effects related to the sex of the subjects or to the age of the facial stimuli were observed.

17.
Subthalamic nucleus (STN) deep brain stimulation (DBS) has recently advanced our understanding of the major role played by this basal ganglion in human emotion. Research indicates that STN DBS can induce modifications in all components of emotion, and neuroimaging studies have shown that the metabolic modifications correlated with these emotional disturbances following surgery are both task- and sensory-input-dependent. Nevertheless, to date, these modifications have not been confirmed for all emotional components, notably subjective emotional experience, or feelings. To identify the neural network underlying the modification of feelings following STN DBS, we assessed 16 patients with Parkinson's disease before and after surgery, using both subjective assessments of emotional experience and [18F]fluorodeoxyglucose positron emission tomography (18FDG-PET). The patients viewed six film excerpts intended to elicit happy, angry, fearful, sad, disgusted, and neutral feelings, and they self-rated the intensity of these feelings. After DBS, there was a significant reduction in the intensity of the disgust feeling. Correlations were observed between decreased disgust experience and cerebral glucose metabolism (FDG uptake) in the bilateral prefrontal cortices (orbitofrontal, dorsolateral, and inferior frontal gyri), bilateral insula, and right cerebellum. We suggest that the STN contributes to the synchronization process underlying the emergence of feelings.

18.
In order to investigate the role of facial movement in the recognition of emotions, faces were covered with black makeup and white spots. Video recordings of such faces were played back so that only the white spots were visible. The results demonstrated that moving displays of happiness, sadness, fear, surprise, anger and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicated that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of these six emotions was also investigated using normally illuminated and spots-only displays. In both instances the results indicated that different facial regions are more informative for different emotions. The movement patterns characterizing the various emotional expressions as well as common confusions between emotions are also discussed.

19.
Previous studies have shown inconsistent findings regarding the contribution of the different prefrontal regions in emotion recognition. Moreover, the hemispheric lateralization hypothesis posits that the right hemisphere is dominant for processing all emotions regardless of affective valence, whereas the valence specificity hypothesis posits that the left hemisphere is specialized for processing positive emotions while the right hemisphere is specialized for negative emotions. However, recent findings suggest that the evidence for such lateralization has been less consistent. In this study, we investigated emotion recognition of fear, surprise, happiness, sadness, disgust, and anger in 30 patients with focal prefrontal cortex lesions and 30 control subjects. We also examined the impact of lesion laterality on recognition of the six basic emotions. The results showed that compared to control subjects, the frontal subgroups were impaired in recognition of three negative basic emotions of fear, sadness, and anger – regardless of the lesion laterality. Therefore, our findings did not establish that each hemisphere is specialized for processing specific emotions. Moreover, the voxel-based lesion symptom mapping analysis showed that recognition of fear, sadness, and anger draws on a partially common bilaterally distributed prefrontal network.

20.
Disgust is one of the most basic emotions in humans and animals. It originates in the oral rejection of bitter (toxic) substances, is often accompanied by nausea, vomiting, and a strong urge to withdraw from the eliciting stimulus, and serves the function of avoiding potential disease threats. A large body of animal and human research shows that oxytocin, progesterone, and estrogen influence, to varying degrees, the perception of core disgust stimuli, the generation and expression of core disgust, conditioned disgust learning, and the recognition of disgusted facial expressions. These three hormones act mainly on receptors for neurotransmitters such as serotonin, GABA, acetylcholine, and glutamate, modulating activity in brain regions including the amygdala, insula, anterior cingulate cortex, putamen, piriform cortex, and middle frontal gyrus, and thereby influencing disgust processing. Future research should, on the basis of accurate hormone measurement and controlled task difficulty, examine the effects of each hormone on disgust processing in different sensory modalities and the moderating role of sex, and should combine neuroimaging techniques with animal behavioral methods to clarify the neuroendocrine mechanisms by which each hormone influences disgust processing.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号