Similar Documents
1.
Arousal and valence have long been studied as the two primary dimensions for the perception of emotional stimuli such as facial expressions. Prior correlational studies that tested emotion perception along these dimensions found broad similarities between adults and children. However, few studies have looked for direct differences between children and adults in these dimensions beyond correlation. We had 9-year-old children and adults rate positive and negative facial stimuli on emotional arousal and valence. Despite high, significant correlations between children's and adults' ratings, our findings also showed significant differences in rating values: children rated the valence of all expressions as significantly more positive than adults did, and rated positive emotions as more arousing than adults did. Our results show that although the perception of facial emotions along arousal and valence follows similar patterns in children and adults, some differences in ratings persist and vary by emotion type.

2.
Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much the recognition and discrimination of expressions rely on the perception of morphological patterns versus the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, although the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to the categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than on affective information and mechanisms.

3.
Findings in the neuroimaging literature suggest that separate brain circuitries are involved when individuals perform emotional compared to nonemotional working memory (WM) tasks. Here we test this hypothesis with behavioural measures. We predicted that the conceptual processing of affect would be disrupted more by concurrent affective than nonaffective load. Participants performed a conceptual task in which they verified affective versus sensory properties of concepts, and a second, concurrent, working memory (n-back) task in which the target stimuli were facial expressions. Results revealed that storing and updating affective (as compared with identity) features of facial expressions altered performance more for affective than for sensory properties of concepts. The findings are supportive of the ideas that affective resources exist and that these resources are specifically used during the processing and representation of affective properties of objects and events.

4.
Based on research linking depressive symptoms and intimate partner aggression perpetration with negatively biased perception of social stimuli, the present authors examined biased perception of emotional expressions as a mechanism in the frequently observed relationship between depression and psychological aggression perpetration. In all, 30 university students made valence ratings (negative to positive) of emotional facial expressions and completed measures of depressive symptoms and psychological aggression perpetration. As expected, depressive symptoms were positively associated with psychological aggression perpetration in an individual's current relationship, and this relationship was mediated by ratings of negative emotional expressions. These findings suggest that negatively biased perception of emotional expressions within the context of elevated depressive symptoms may represent an early stage of information processing that leads to aggressive relationship behaviors.

5.
This investigation examined whether impairment in configural processing could explain deficits in facial emotion recognition in people with Parkinson's disease (PD). Stimuli from the Radboud Faces Database were used to compare recognition of four negative emotion expressions by older adults with PD (n = 16) and matched controls (n = 17). Participants categorized emotional expressions from upright and inverted whole faces and facial composites; because it is difficult to derive configural information from these two types of stimuli, featural processing should play a larger than usual role in accurate recognition of emotional expressions. We found that the PD group was impaired relative to controls in recognizing anger, disgust, and fear in upright faces. Consistent with a configural processing deficit, participants with PD showed no composite effect when attempting to identify facial expressions of anger, disgust, and fear. A face inversion effect, however, was observed in the performance of all participants in both the whole-face and facial-composite tasks. These findings can be explained in terms of a configural processing deficit if it is assumed that the disruption caused by facial composites was specific to configural processing, whereas inversion reduced performance by making it difficult to derive both featural and configural information from faces.
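The facial-composite manipulation described above (fusing the top half of one expression with the bottom half of another, with misalignment used to break configural fusion) can be sketched as a simple array operation. This is only an illustrative sketch: the arrays below are placeholders standing in for face photographs, not the Radboud stimuli, and the `shift` parameter is an assumed value.

```python
import numpy as np

def make_composite(top_face, bottom_face, misaligned=False, shift=2):
    """Fuse the top half of one face image with the bottom half of another.
    Aligned composites invite configural (whole-face) processing; laterally
    shifting the bottom half (misaligned) disrupts that fusion."""
    h = top_face.shape[0] // 2
    top = top_face[:h]
    bottom = bottom_face[h:]
    if misaligned:
        # Lateral offset of the bottom half breaks the face configuration
        bottom = np.roll(bottom, shift, axis=1)
    return np.vstack([top, bottom])

# Placeholder 8x8 grayscale arrays standing in for two expression photos
anger = np.full((8, 8), 50.0)
fear = np.full((8, 8), 200.0)

aligned = make_composite(anger, fear)
misaligned = make_composite(anger, fear, misaligned=True)
print(aligned.shape)  # (8, 8)
```

In the actual paradigm, observers judge the expression of one half while ignoring the other; aligned composites are harder to judge than misaligned ones when configural processing is intact, so the absence of that cost indexes a configural deficit.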

6.
Previous research has found that the perception of eye-gaze direction is influenced by facial expression: angry faces are more likely than fearful faces to be judged as looking at the observer. Although researchers have proposed different explanations, it remains unclear whether this differential influence of angry and fearful expressions on gaze perception stems from the configural information of the face or from its physical feature information. The present study used a gaze-direction discrimination task, with the cone of direct gaze (CoDG) as the dependent variable, and upright, inverted, and blurred face images as stimuli, aiming to address this question by dissociating configural and featural information. Results showed that when all facial information was preserved (Experiment 1), the CoDG was wider for angry than for fearful faces; when configural processing was disrupted and only featural processing remained (Experiment 2), the difference in CoDG between angry and fearful expressions disappeared; and when featural information was degraded while configural processing was preserved (Experiment 3), the difference reappeared. These results indicate that the differential influence of threatening facial expressions on gaze perception arises mainly from differences in the processing of emotion-relevant configural information rather than from low-level physical differences, supporting the shared-signal hypothesis and the emotional-appraisal account as explanations of the integrated processing of threatening facial expressions and gaze direction.

7.
This study investigated whether observers' facial reactions to the emotional facial expressions of others represent an affective or a cognitive response to these emotional expressions. Three hypotheses were contrasted: (1) facial reactions to emotional facial expressions are due to mimicry as part of an affective empathic reaction; (2) facial reactions to emotional facial expressions are a reflection of shared affect due to emotion induction; and (3) facial reactions to emotional facial expressions are determined by cognitive load depending on task difficulty. Two experiments were conducted varying type of task, presentation of stimuli, and task difficulty. The results show that depending on the nature of the rating task, facial reactions to facial expressions may be either affective or cognitive. Specifically, evidence for facial mimicry was only found when individuals made judgements regarding the valence of an emotional facial expression. Other types of judgements regarding facial expressions did not seem to elicit mimicry but may lead to facial responses related to cognitive load.

8.
Previous work has shown that patients with depersonalization disorder (DPD) have reduced physiological responses to emotional stimuli, which may be related to subjective emotional numbing. This study investigated two aspects of affective processing in 13 patients with DPD according to the DSM-IV criteria and healthy controls: the perception of emotional facial expressions (anger, disgust, fear, happiness, sadness, and surprise) and memory for emotional stimuli. Results revealed a specific lack of sensitivity to facial expressions of anger in patients, but normal enhancement of memory for peripheral aspects of arousing emotional material. The results are consistent with altered processing of threat-related stimuli but intact consolidation processes, at least when the stimuli involved are potently arousing.

9.
Current theories of emotion perception posit that basic facial expressions signal categorically discrete emotions or affective dimensions of valence and arousal. In both cases, the information is thought to be directly "read out" from the face in a way that is largely immune to context. In contrast, the three studies reported here demonstrated that identical facial configurations convey strikingly different emotions and dimensional values depending on the affective context in which they are embedded. This effect is modulated by the similarity between the target facial expression and the facial expression typically associated with the context. Moreover, by monitoring eye movements, we demonstrated that characteristic fixation patterns previously thought to be determined solely by the facial expression are systematically modulated by emotional context already at very early stages of visual processing, even by the first time the face is fixated. Our results indicate that the perception of basic facial expressions is not context invariant and can be categorically altered by context at early perceptual levels.

10.
We examined the effects of emotional bodily expressions on the perception of time. Participants were shown bodily expressions of fear, happiness and sadness in a temporal bisection task featuring different stimulus duration ranges. Stimulus durations were judged to be longer for bodily expressions of fear than for those of sadness, whereas no significant difference was observed between sad and happy postures. In addition, the magnitude of the lengthening effect of fearful versus sad postures increased with duration range. These results suggest that the perception of fearful bodily expressions increases the level of arousal which, in turn, speeds up the internal clock system underlying the representation of time. The effect of bodily expressions on time perception is thus consistent with findings for other highly arousing emotional stimuli, such as emotional facial expressions.
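The internal-clock account invoked here can be illustrated with a minimal pacemaker-accumulator simulation: arousal raises the pacemaker rate, so the same physical duration accumulates more pulses and is more often judged "long". The rates and anchor durations below are illustrative assumptions, not parameters from the study.

```python
# Minimal pacemaker-accumulator sketch of the internal-clock account.
# All numeric values are assumed for illustration only.

def bisection_judgement(duration_s, clock_rate_hz, criterion_pulses):
    """Judge a duration 'long' if its accumulated pulse count
    exceeds the learned criterion."""
    return "long" if duration_s * clock_rate_hz > criterion_pulses else "short"

BASELINE_RATE = 10.0  # pulses/s while viewing a sad (low-arousal) posture
AROUSED_RATE = 12.0   # pulses/s while viewing a fearful (high-arousal) posture

# Criterion learned at the baseline rate, placed at the geometric mean
# of 0.4 s and 1.6 s anchor durations: 0.8 s * 10 Hz = 8 pulses.
criterion = (0.4 * 1.6) ** 0.5 * BASELINE_RATE

# The same 0.7 s stimulus flips from "short" to "long" under arousal.
# Note the pulse-count difference (duration * rate difference) grows
# with duration, matching the reported effect of duration range.
print(bisection_judgement(0.7, BASELINE_RATE, criterion))  # short (7.0 < 8)
print(bisection_judgement(0.7, AROUSED_RATE, criterion))   # long (8.4 > 8)
```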

11.
We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness-fear and anger-disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness-fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise-sadness and excitement-disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.
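A morphed continuum like the one in Experiment 1 can be approximated, at the pixel level, by linear interpolation between two aligned endpoint images. Real expression morphing also warps facial geometry, so this is only an illustrative sketch, with tiny made-up arrays standing in for face photographs.

```python
import numpy as np

def morph_continuum(face_a, face_b, n_steps=7):
    """Linearly interpolate between two aligned face images, returning
    n_steps frames from 100% face_a to 100% face_b."""
    weights = np.linspace(0.0, 1.0, n_steps)
    return [(1.0 - w) * face_a + w * face_b for w in weights]

# Tiny grayscale stand-ins for the 'happiness' and 'fear' endpoint photos
happy = np.zeros((4, 4))
fear = np.full((4, 4), 255.0)

continuum = morph_continuum(happy, fear, n_steps=5)
# The middle frame is an equal blend of the two endpoint expressions
print(continuum[2].mean())  # 127.5
```

Categorical perception then shows up as a sharp boundary somewhere along such a physically uniform continuum, rather than as judgments that track the morph weight linearly.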

12.
Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and a group of 53 healthy subjects. By means of discrimination and identification tasks, we compared two stages of the visual recognition process that could be selectively affected in PD; in addition, comparing facial expression with gender and age allowed us to test whether emotional content influences the configural perception of faces. In Experiment I, we found no differences between groups, for either facial expression or age, in the discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that the configural perception of faces does not seem to be globally impaired in PD; rather, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that the decline in facial expression categorization is more evident in a subgroup of patients with greater global (motor and cognitive) impairment. These results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits that characterizes PD.

13.
There is evidence that men and women differ in both cognitive and affective functions. Recent studies have examined the processing of emotions in males and females, but the findings are inconclusive, possibly as a result of methodological differences. The aim of this study was to investigate the perception of emotional facial expressions in men and women. Video clips of neutral faces gradually morphing into full-blown expressions were used, allowing us to examine both accuracy and sensitivity in labelling emotional facial expressions. Furthermore, all participants completed an anxiety and a depression rating scale. Research participants were 40 female students and 28 male students. Results revealed that men were both less accurate and less sensitive in labelling facial expressions; thus, men showed an overall worse performance than women on a task measuring the processing of emotional faces. This result is discussed in relation to recent findings.

15.
To examine how individuals with high trait anxiety process emotional stimuli at the pre-attentive stage and to clarify their emotional bias, this study used a deviant-standard reverse oddball paradigm to investigate the effect of trait anxiety on the pre-attentive processing of facial expressions. Results showed that in the low trait anxiety group, the early EMMN elicited by sad faces was significantly larger than that elicited by happy faces, whereas in the high trait anxiety group the early EMMN did not differ between happy and sad faces. Moreover, the EMMN amplitude for happy faces was significantly larger in the high than in the low trait anxiety group. These results indicate that personality traits are an important factor influencing the pre-attentive processing of facial expressions. Unlike typical participants, individuals with high trait anxiety show a similar pre-attentive processing pattern for happy and sad faces and may have difficulty effectively distinguishing happy from sad emotional faces.

16.
Emotion influences memory in many ways. For example, when a mood-dependent processing shift is operative, happy moods promote global processing and sad moods direct attention to local features of complex visual stimuli. We hypothesized that an emotional context associated with to-be-learned facial stimuli could preferentially promote global or local processing. At learning, faces with neutral expressions were paired with a narrative providing either a happy or a sad context. At test, faces were presented in an upright or inverted orientation, emphasizing configural or analytical processing, respectively. A recognition advantage was found for upright faces learned in happy contexts relative to those in sad contexts, whereas recognition was better for inverted faces learned in sad contexts than for those in happy contexts. We thus infer that a positive emotional context prompted more effective storage of holistic, configural, or global facial information, whereas a negative emotional context prompted relatively more effective storage of local or feature-based facial information.

17.
It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.

18.
This study investigated the cross-modal simultaneous processing of emotional tone of voice and emotional facial expression with event-related potentials (ERPs), using a wide range of emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual stimuli (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N = 31) were required to watch and listen to the stimuli in order to comprehend them. Repeated-measures ANOVAs showed a positive ERP deflection (P2) with a more posterior distribution. This P2 effect may represent a marker of cross-modal integration, modulated as a function of the congruous/incongruous condition: it showed a larger peak in response to congruous than to incongruous stimuli. It is suggested that the P2 may be a cognitive marker of multisensory processing, independent of the emotional content.

19.
Does our perception of others' emotional signals depend on the language we speak or is our perception the same regardless of language and culture? It is well established that human emotional facial expressions are perceived categorically by viewers, but whether this is driven by perceptual or linguistic mechanisms is debated. We report an investigation into the perception of emotional facial expressions, comparing German speakers to native speakers of Yucatec Maya, a language with no lexical labels that distinguish disgust from anger. In a free naming task, speakers of German, but not Yucatec Maya, made lexical distinctions between disgust and anger. However, in a delayed match-to-sample task, both groups perceived emotional facial expressions of these and other emotions categorically. The magnitude of this effect was equivalent across the language groups, as well as across emotion continua with and without lexical distinctions. Our results show that the perception of affective signals is not driven by lexical labels, instead lending support to accounts of emotions as a set of biologically evolved mechanisms.

20.
Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research.
