Similar Articles (20 results found)
1.
The notion of social appraisal emphasizes the importance of a social dimension in appraisal theories of emotion by proposing that the way an individual appraises an event is influenced by the way other individuals appraise and feel about the same event. This study directly tested this proposal by asking participants to recognize dynamic facial expressions of emotion (fear, happiness, or anger in Experiment 1; fear, happiness, anger, or neutral in Experiment 2) in a target face presented at the center of a screen while a contextual face, which appeared simultaneously in the periphery of the screen, expressed an emotion (fear, happiness, anger) or not (neutral) and either looked at the target face or not. We manipulated gaze direction to be able to distinguish between a mere contextual effect (gaze away from both the target face and the participant) and a specific social appraisal effect (gaze toward the target face). Results of both experiments provided evidence for a social appraisal effect in emotion recognition, which differed from the mere effect of contextual information: Whereas facial expressions were identical in both conditions, the direction of the gaze of the contextual face influenced emotion recognition. Social appraisal facilitated the recognition of anger, happiness, and fear when the contextual face expressed the same emotion. This facilitation was stronger than the mere contextual effect. Social appraisal also allowed better recognition of fear when the contextual face expressed anger and better recognition of anger when the contextual face expressed fear.

2.
Research suggests that infants progress from discrimination to recognition of emotions in faces during the first half year of life. It is unknown whether the perception of emotions from bodies develops in a similar manner. In the current study, when presented with happy and angry body videos and voices, 5-month-olds looked longer at the matching video when they were presented upright but not when they were inverted. In contrast, 3.5-month-olds failed to match even with upright videos. Thus, 5-month-olds but not 3.5-month-olds exhibited evidence of recognition of emotions from bodies by demonstrating intermodal matching. In a subsequent experiment, younger infants did discriminate between body emotion videos but failed to exhibit an inversion effect, suggesting that discrimination may be based on low-level stimulus features. These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months. This pattern of development is similar to face emotion knowledge development and suggests that both the face and body emotion perception systems develop rapidly during the first half year of life.

3.
This study examined the effects of emotion priming on visual search in participants characterised for different levels of social anxiety. Participants were primed with five facial emotions (angry, fearful, happy, neutral, and surprised) or a scrambled face immediately prior to visual search trials that involved finding a slanted coloured line amongst distractors, while reaction times and accuracy of target detection were recorded. Results suggest that for individuals low in social anxiety, being primed with an angry, surprised, or fearful face facilitated visual search compared to being primed with a scrambled, neutral, or happy face. However, these same emotions degraded visual search in participants with high levels of social anxiety. This study expands on previous research on the impact of emotion on attention, finding that amongst socially anxious individuals, the effects of priming with threat extend beyond initial attention capture or disengagement, degrading later visual search.

4.
The aim of the present study was to contribute to the literature on the ability to recognize anger, happiness, fear, surprise, sadness, disgust, and neutral expressions from facial information (whole face, eye region, mouth region). More specifically, the aim was to investigate older adults' performance in emotion recognition using the same tool used in previous studies on children and adults, and to verify whether their pattern of emotion recognition differs from that of the other two groups. Results showed that happiness was among the easiest emotions for older adults to recognize, whereas disgust was consistently among the most difficult. The findings indicate that emotions are more easily recognized when pictures represent the whole face; of the two specific regions (eye and mouth), older participants recognized emotions more easily when the mouth region was presented. In general, the study did not detect a decay in the ability to recognize emotions from the face, eyes, or mouth. The performance of the older adults was statistically worse than that of the other two groups in only a few cases: anger and disgust recognition from the whole face; anger recognition from the eye region; and disgust, fear, and neutral expression recognition from the mouth region.

5.
Research has demonstrated that individuals high in implicit prejudice are more likely to classify a racially ambiguous angry face as Black compared to individuals low in implicit prejudice [Hugenberg, K., & Bodenhausen, G. V. (2004). Ambiguity in social categorization. Psychological Science, 15, 342-345]. The current study sought to replicate and extend this finding by examining whether the same expression of anger on a racially ambiguous face is perceived to be differentially intense when the face is judged to be Black or White. White participants viewed racially ambiguous, White, and Black faces displaying angry, neutral, or happy emotions. Participants' task was to identify the race, emotion, and intensity of emotion display. The results revealed that participants high in implicit prejudice reported significantly more of the racially ambiguous angry faces as Black compared to participants low in implicit prejudice. Further, participants high in implicit prejudice reported the intensity of the racially ambiguous angry emotion as greater when the same face had been categorized as Black compared to White. The results suggest that implicit prejudice is not only associated with the racial categorization of an ambiguous face but also the perceived intensity of the emotion displayed.

6.
Correctly perceiving emotions in others is a crucial part of social interaction. We constructed a set of dynamic stimuli to determine the relative contributions of the face and body to the accurate perception of basic emotions. We also manipulated the length of these dynamic stimuli to explore how much information is needed to identify emotions. The findings suggest that even a short exposure time of 250 milliseconds provided enough information to identify an emotion above chance level. Furthermore, we found that recognition patterns from the face alone and the body alone differed as a function of emotion. These findings highlight the role of the body in emotion perception and suggest an advantage for angry bodies: in contrast to all other emotions, recognition rates for anger from the body were comparable to those from the face, which may be advantageous for perceiving imminent threat from a distance.

7.
Emotion recognition is mediated by a complex network of cortical and subcortical areas, with the two hemispheres likely being differently involved in processing positive and negative emotions. As results on valence-dependent hemispheric specialisation are quite inconsistent, we carried out three experiments with emotional stimuli using a task sensitive to specific hemispheric processing. Participants were required to bisect visual lines that were delimited by emotional face flankers, or to haptically bisect rods while concurrently listening to emotional vocal expressions. We found that prolonged (but not transient) exposure to concurrent happy stimuli significantly shifted the bisection bias to the right compared to both sad and neutral stimuli, indexing a greater involvement of the left hemisphere in processing positively connoted stimuli. No differences between sad and neutral stimuli were observed across the experiments. In sum, our data provide consistent evidence in favour of a greater involvement of the left hemisphere in processing positive emotions and suggest that (prolonged) exposure to stimuli expressing happiness significantly affects the allocation of (spatial) attentional resources, regardless of the sensory modality (visual/auditory) in which the emotion is perceived and the modality (visual/haptic) in which space is explored.

8.
For centuries, folk theory has promoted the idea that positive emotions are good for your health. Accumulating empirical evidence is providing support for this anecdotal wisdom. We use the broaden-and-build theory of positive emotions (Fredrickson, 1998; 2001) as a framework to demonstrate that positive emotions contribute to psychological and physical well-being via more effective coping. We argue that the health benefits advanced by positive emotions may be instantiated in certain traits that are characterized by the experience of positive emotion. Towards this end, we examine individual differences in psychological resilience (the ability to bounce back from negative events by using positive emotions to cope) and positive emotional granularity (the tendency to represent experiences of positive emotion with precision and specificity). Individual differences in these traits are examined in two studies, one using psychophysiological evidence, the second using evidence from experience sampling, to demonstrate that positive emotions play a crucial role in enhancing coping resources in the face of negative events. Implications for research on coping and health are discussed.

9.
Considerable evidence indicates that the amygdala plays a critical role in negative, aversive human emotions. Although researchers have speculated that the amygdala plays a role in positive emotion, little relevant evidence exists. We examined the neural correlates of positive and negative emotion using positron emission tomography (PET), focusing on the amygdala. Participants viewed positive and negative photographs, as well as interesting and uninteresting neutral photographs, during PET scanning. The left amygdala and ventromedial prefrontal cortex were activated during positive emotion, and bilateral amygdala activation occurred during negative emotion. High-interest, unusual photographs also elicited left-amygdala activation, a finding consistent with suggestions that the amygdala is involved in vigilance reactions to associatively ambiguous stimuli. The current results constitute the first neuroimaging evidence for a role of the amygdala in positive emotional reactions elicited by visual stimuli. Although the amygdala appears to play a more extensive role in negative emotion, it is involved in positive emotion as well.

10.
People implicitly associate different emotions with different locations in left-right space. Which aspects of emotion do they spatialize, and why? Across many studies people spatialize emotional valence, mapping positive emotions onto their dominant side of space and negative emotions onto their non-dominant side, consistent with theories of metaphorical mental representation. Yet other results suggest a conflicting mapping of emotional intensity (a.k.a., emotional magnitude), according to which people associate more intense emotions with the right and less intense emotions with the left, regardless of their valence; this pattern has been interpreted as support for a domain-general system for representing magnitudes. To resolve the apparent contradiction between these mappings, we first tested whether people implicitly map either valence or intensity onto left-right space, depending on which dimension of emotion they attend to (Experiments 1a, b). When asked to judge emotional valence, participants showed the predicted valence mapping. However, when asked to judge emotional intensity, participants showed no systematic intensity mapping. We then tested an alternative explanation of findings previously interpreted as evidence for an intensity mapping (Experiments 2a, b). These results suggest that previous findings may reflect a left-right mapping of spatial magnitude (i.e., the size of a salient feature of the stimuli) rather than emotion. People implicitly spatialize emotional valence, but, at present, there is no clear evidence for an implicit lateral mapping of emotional intensity. These findings support metaphor theory and challenge the proposal that mental magnitudes are represented by a domain-general metric that extends to the domain of emotion.

11.
Findings from subjects with unilateral brain damage, as well as from normal subjects studied with tachistoscopic paradigms, argue that emotion is processed differently by each brain hemisphere. An open question concerns the extent to which such lateralised processing might occur under natural, free-viewing conditions. To explore this issue, we asked 28 normal subjects to discriminate emotions expressed by pairs of faces shown side-by-side, with no time or viewing constraints. Images of neutral expressions were shown paired with morphed images of very faint emotional expressions (happiness, surprise, disgust, fear, anger, or sadness). We found a surprising and robust laterality effect: When discriminating negative emotional expressions, subjects performed significantly better when the emotional face was to the left of the neutral face; conversely, when discriminating positive expressions, subjects performed better when the emotional face was to the right. We interpret this valence-specific laterality effect as consistent with the idea that the right hemisphere is specialised to process negative emotions, whereas the left is specialised to process positive emotions. The findings have important implications for how humans perceive facial emotion under natural conditions.

12.
People nowadays display facial expressions not only in face-to-face communication but also in online communication, using graphical symbols known as emojis. The present study explored 30-month-old toddlers' ability to recognize emojis representing six basic human emotions. In the study, 38 toddlers first saw scenarios that elicited different emotions in an actor and were asked to visually identify the matching emoji in the presence of a distractor. Eye-tracking results showed that toddlers could correctly identify the emoji that represented the emotion in each scenario. Toddlers then heard different emotion words and were again able to identify the matching emoji. These findings provide preliminary evidence that the ability to recognize and understand the emotional value of emotion symbols in the virtual world emerges early in development.

13.
Research on intertemporal choice shows that decision behavior differs significantly between positive and negative emotional states. The present study examined the process mechanism by which emotion influences intertemporal choice from the perspective of the single-dimension dominance model. In Experiment 1, positive and negative emotions were induced; participants in the positive-emotion condition showed a lower temporal discount rate and a stronger tendency to choose the delayed option. Experiment 2 used a "simulated balance-scale task" to measure between-dimension difference comparison during intertemporal choice and tested whether the single-dimension dominance model can explain the effect of emotion on intertemporal choice. Results showed that between-dimension difference comparison mediated the effect of emotion on intertemporal choice. Experiments 3a and 3b manipulated the between-dimension comparison process via time priming and money priming, respectively, again verifying the explanatory role of the single-dimension dominance model: the effect of emotion on intertemporal choice disappeared under time and money priming, further supporting the mediating role of between-dimension difference comparison. From a process perspective, this study reveals the psychological mechanism by which emotion influences intertemporal choice and adds supporting evidence for the single-dimension dominance model's account of intertemporal choice behavior.
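The temporal discount rate in the abstract above can be illustrated with the standard hyperbolic discounting model, V = A / (1 + kD): a lower discount rate k makes a delayed reward retain more of its value, producing a stronger tendency to choose the delayed option. This is a minimal sketch of the general model only, not the study's actual task; the amounts, delays, and k values are hypothetical.

```python
# Hyperbolic discounting: subjective value V = A / (1 + k * D),
# where A is the reward amount, D the delay, and k the discount rate.

def discounted_value(amount, delay_days, k):
    """Subjective present value of a reward delivered after a delay."""
    return amount / (1.0 + k * delay_days)

def prefers_delayed(small_now, large_later, delay_days, k):
    """True if the discounted delayed reward beats the immediate one."""
    return discounted_value(large_later, delay_days, k) > small_now

# Hypothetical choice: 50 now vs. 100 in 90 days, under two discount rates.
low_k = 0.005   # lower discount rate (e.g., a positive-emotion condition)
high_k = 0.05   # higher discount rate

print(prefers_delayed(50, 100, 90, low_k))   # lower k favors the delayed option
print(prefers_delayed(50, 100, 90, high_k))
```

With k = 0.005 the delayed 100 is still worth about 69 and is chosen over 50; with k = 0.05 it shrinks to about 18 and the immediate option wins, mirroring the reported direction of the emotion effect.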

14.
Past research on the relationship between affect and creativity has yielded contradictory results. Most of the evidence has tended to show that brief positive emotions as well as more enduring positive moods enhance creativity. No study to date, however, has attempted to determine whether the influence of momentary emotions on creativity depends on pre-existing moods. In the present study, 96 undergraduates completed one of two creative tasks (generating or evaluating captions for photographs) on three occasions, after watching videos designed to induce positive, neutral, or negative emotions. Participants also completed a questionnaire assessing depressed mood. Results confirmed that the effect of emotion inductions on creativity depended on pre-existing mood. Participants low in depression wrote more creative captions and rated captions more accurately with induced negative emotion than with induced positive emotion. In contrast, participants high in depression appeared impervious to the effect of emotion inductions.

15.
Due to mood-congruency effects, we expect the emotion perceived on a face to be biased towards one's own mood. But the findings in the scant literature on such mood effects in normal healthy populations have not consistently and adequately supported this expectation. Employing effective mood manipulation techniques that ensured that the intended mood was sustained throughout the perception task, we explored mood-congruent intensity and recognition accuracy biases in emotion perception. Using realistic face stimuli with expressive cues of happiness and sadness, we demonstrated that happy, neutral and ambiguous expressions were perceived more positively in the positive than in the negative mood. The mood-congruency effect decreased with the degree of perceived negativity in the expression. Also, males were more affected by the mood-congruency effect in intensity perception than females. We suggest that the greater salience and better processing of negative stimuli and the superior cognitive ability of females in emotion perception are responsible for these observations. We found no evidence for mood-congruency effect in the recognition accuracy of emotions and suggest with supporting evidence that past reports of this effect may be attributed to response bias driven by mood.

16.
In this paper, we study existing models of emotion space using centrality, a measure borrowed from network theory, identifying key emotions as the central nodes in a network in order to understand existing emotion spaces in a new way. Using several different definitions of centrality, key emotions were identified for four existing emotion space models. We also propose a method for integrating existing spaces to build a refined space with more emotion terms. Each model identified different key emotions. Even when we reduced the emotion spaces so that each contained the same 21 common emotions, the key emotions identified remained different, implying fundamental structural differences among existing emotion space models. These findings call for further experimental verification and refinement of emotion models to make them more useful in future emotion research.

17.
Recent research suggests that eye-gaze direction modulates perceived emotional expression. Here we explore the extent to which emotion affects interpretation of attention direction. We captured three-dimensional face models of 8 actors expressing happy, fearful, angry and neutral emotions. From these 3D models 9 views were extracted (0°, 2°, 4°, 6°, 8° to the left and right). These stimuli were randomly presented for 150 ms. Using a forced-choice paradigm 28 participants judged for each face whether or not it was attending to them. Two conditions were tested: either the whole face was visible, or the eyes were covered. In both conditions happy faces elicited most "attending-to-me" answers. Thus, emotional expression has a more general effect than an influence on gaze direction: emotion affects interpretation of attention direction. We interpret these results as a self-referential positivity bias, suggesting a general preference to associate a happy face with the self.

18.
To date little evidence is available as to how emotional facial expression is decoded, specifically whether a bottom-up (data-driven) or a top-down (schema-driven) approach is more appropriate in explaining the decoding of emotions from facial expression. A study is reported (conducted with N = 20 subjects each in Germany and Italy), in which decoders judged emotions from photographs of facial expressions. Stimuli represented a selection of photographs depicting both single muscular movements (action units) in an otherwise neutral face, and combinations of such action units. Results indicate that the meaning of action units changes often with context; only a few single action units transmit specific emotional meaning, which they retain when presented in context. The results are replicated to a large degree across decoder samples in both nations, implying fundamental mechanisms of emotion decoding.

19.
Evidence suggests that autism is associated with impaired emotion perception, but it is unknown how early such impairments are evident. Furthermore, most studies that have assessed emotion perception in children with autism have required verbal responses, making results difficult to interpret. This study utilized high-density event-related potentials (ERPs) to investigate whether 3-4-year-old children with autism spectrum disorder (ASD) show differential brain activity to fear versus neutral facial expressions. It has been shown that normal infants as young as 7 months of age show differential brain responses to faces expressing different emotions. ERPs were recorded while children passively viewed photos of an unfamiliar woman posing a neutral and a prototypic fear expression. The sample consisted of 29 3-4-year-old children with ASD and 22 chronological age-matched children with typical development. Typically developing children exhibited a larger early negative component (N300) to the fear than to the neutral face. In contrast, children with ASD did not show the difference in amplitude of this early ERP component to the fear versus neutral face. For a later component, typically developing children exhibited a larger negative slow wave (NSW) to the fear than to the neutral face, whereas children with autism did not show a differential NSW to the two stimuli. In children with ASD, faster speed of early processing (i.e., N300 latency) of the fear face was associated with better performance on tasks assessing social attention (social orienting, joint attention and attention to distress). These data suggest that children with ASD, as young as 3 years of age, show a disordered pattern of neural responses to emotional stimuli.

20.
Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it.
