Similar Articles
 20 similar articles retrieved (search time: 31 ms)
1.
Emotional stimuli receive high processing priority in attention and memory. This processing "advantage" is generally thought to be predominantly mediated by arousal. However, recent data suggest that ratings of an image's affective "impact" may be a better predictor of recollection than arousal or valence. One interpretation of these findings is that high-impact images may draw an individual's attention, thus facilitating subsequent processing. We investigated the allocation of visual attention to negative emotional images that differed in impact but were matched for valence, arousal, and other characteristics. Participants viewed a central image flanked by 2 neutral indoor or outdoor scenes and made speeded judgments about whether the neutral scenes matched. In Experiment 1, responses were slower on high-impact relative to low-impact or neutral trials. In Experiment 2, responses on high-arousal relative to low-arousal trials did not differ significantly. These data provide evidence for differential allocation of attention to distinct sets of negative, equally arousing images, and argue against the prevailing view that heightened attention to and processing of emotional stimuli relate simply to arousal or valence.

2.
People who listen to a narrative concerning another's experience feel the urge to share in turn their experience of listening. This phenomenon is called secondary social sharing of emotion and has been widely investigated in the last ten years (Christophe & Di Giacomo, 1995; Christophe & Rimé, 1997). The present two studies aimed to provide new evidence concerning secondary social sharing of emotion. In the first study, participants were asked to recall an emotional narrative they had been told no more than three months before and to specify their social sharing about the narrative. In the second study, a diary strategy was used to encourage participants to recall an emotional narrative they had listened to during the preceding day. A follow-up, three weeks after the completion of the diaries, was used to assess secondary social sharing over time. Results from both studies confirmed that secondary social sharing is a widespread phenomenon, involving many partners, mainly belonging to the circle of intimates, and affected by the intensity of listeners' emotional reactions. Adults exhibited significantly higher ratings of secondary social sharing than young people. In the first study, the valence (positive vs. negative) of the emotional experience affected secondary social sharing. However, no differences were found for sharing positive and negative experiences in the diary study.

3.
Women's memories of emotional events differing by both valence and intensity were examined for differences in narrative content and structure, as well as subjective memory ratings. Emotional valence was related to the content of the women's narratives, and emotional intensity was related to the subjective ratings of the memories. Negative narratives contained more negative emotion, cognitive processing words, and passive sentences than positive narratives, and positive narratives contained more positive emotion words and were more complex than negative narratives. Intensely negative narratives were the longest and the least complex, and intensely positive narratives were the most coherent. Women rated both intensely negative and intensely positive events, in general, as more frequently talked/thought about, significant, unique, emotional, and vivid than moderately emotional events, and negative events were rated as more emotional than positive narratives. There was little relation between the objective content of the narratives and the women's subjective ratings of their memory experiences. Finally, researcher‐defined traumatic events did not differ from other intensely negative events. The results of this study have important implications for narrative research in general, methodological issues such as the validity of text analysis programs and subjective memory ratings, and the quality of traumatic memories. Copyright © 2004 John Wiley & Sons, Ltd.

4.
Arousal and valence have long been studied as the two primary dimensions for the perception of emotional stimuli such as facial expressions. Prior correlational studies that tested emotion perception along these dimensions found broad similarities between adults and children. However, few studies looked for direct differences between children and adults in these dimensions beyond correlation. We tested 9-year-old children and adults on rating positive and negative facial stimuli based on emotional arousal and valence. Despite high significant correlations between children’s and adults’ ratings, our findings also showed significant differences between children and adults in terms of rating values: Children rated all expressions as significantly more positive than adults in valence. Children also rated positive emotions as more arousing than adults. Our results show that although perception of facial emotions along arousal and valence follows similar patterns in children and adults, some differences in ratings persist, and vary by emotion type.

5.
Human emotions undoubtedly differ in intensity. However, susceptibility to the intensity of human emotions has long received insufficient attention. By systematically manipulating the valence intensity of stimulus materials, the authors and their research team investigated human susceptibility to the valence intensity of emotional events and its neural mechanisms at three levels: emotional susceptibility itself, the influence of emotion on higher cognition, and individual differences. The results showed that: (1) humans are more sensitive to the valence intensity of negative than of positive stimuli, an effect that may be related to the vigilance function of the right hippocampus/amygdala; (2) consistent with this, negative emotions of different valence intensities had significantly different effects on higher cognitive processes such as novelty processing, target detection, and executive control; (3) susceptibility to emotional intensity shows marked individual differences: women are more susceptible than men to mildly negative emotional events, and extraverts are more susceptible than ambiverts to pleasant stimuli but less susceptible to mildly negative events.

6.
Previous findings regarding the relationship between emotional valence and psychological distance were mixed. The current research examined whether emotional intensity moderates the influence of emotional valence on psychological distance. We manipulated intensity and valence by asking participants to describe a positive or negative event from either a high intensity or low intensity perspective. Studies 1 and 2 revealed that negative events were perceived to be more distant than positive events in the low‐intensity condition in two distinct cultural groups. Study 3 further showed that the obtained patterns were generalizable to different emotions. Finally, Study 4 found that a reduced alert level, but not perceived threats, mediated the interactive effects of valence and intensity on psychological distance. This research highlights the importance of considering the joint effect of different dimensions of emotion, thus advancing the understanding of complex processes of emotion.

7.
Studies using facial emotional expressions as stimuli partially support the assumption of biased processing of social signals in social phobia. This pilot study explored for the first time whether individuals with social phobia display a processing bias towards emotional prosody. Fifteen individuals with generalized social phobia and fifteen healthy controls (HC) matched for gender, age, and education completed a recognition test consisting of meaningless utterances spoken in a neutral, angry, sad, fearful, disgusted or happy tone of voice. Participants also evaluated the stimuli with regard to valence and arousal. While these ratings did not differ significantly between groups, analysis of the recognition test revealed enhanced identification of sad and fearful voices and decreased identification of happy voices in individuals with social phobia compared with HC. The two groups did not differ in their processing of neutral, disgust, and anger prosody.

8.
Four different patterns of biased ratings of facial expressions of emotions have been found in socially anxious participants: higher negative ratings of (1) negative, (2) neutral, and (3) positive facial expressions than nonanxious controls. As a fourth pattern, some studies have found no group differences in ratings of facial expressions of emotion. However, these studies usually employed valence and arousal ratings that arguably may be less able to reflect processing of social information. We examined the relationship between social anxiety and face ratings for perceived trustworthiness given that trustworthiness is an inherently socially relevant construct. Improving on earlier analytical strategies, we evaluated the four previously found result patterns using a Bayesian approach. Ninety-eight undergraduates rated 198 face stimuli on perceived trustworthiness. Subsequently, participants completed social anxiety questionnaires to assess the severity of social fears. Bayesian modeling indicated that the probability that social anxiety did not influence judgments of trustworthiness had at least three times more empirical support in our sample than assuming any kind of negative interpretation bias in social anxiety. We concluded that the deviant interpretation of facial trustworthiness is not a relevant aspect in social anxiety.

9.
In the present study, we collected valence, arousal, concreteness, familiarity, imageability, and context availability ratings for a total of 1,100 Chinese words. The ratings for all variables were collected with 9-point Likert scales. We tested the reliability of the present database by comparing it to the extant Chinese Affective Word System, and performed split-half correlations for all six variables. We then evaluated the relationships between all variables. Regarding the affective variables, we found a typical quadratic relation between valence and arousal, in line with previous findings. Likewise, significant correlations were found between the semantic variables. Importantly, we explored the relationships between ratings for the affective variables (i.e., valence and arousal) and concreteness ratings, suggesting that valence and arousal ratings can predict concreteness ratings. This database of affective norms will be a valuable source of information for emotion research that makes use of Chinese words, and will enable researchers to use highly controlled Chinese verbal stimuli to more reliably investigate the relation between cognition and emotion.

10.
The present study addressed the hypothesis that emotional stimuli relevant to survival or reproduction (biologically emotional stimuli) automatically affect cognitive processing (e.g., attention, memory), while those relevant to social life (socially emotional stimuli) require elaborative processing to modulate attention and memory. Results of our behavioral studies showed that (1) biologically emotional images hold attention more strongly than do socially emotional images, (2) memory for biologically emotional images was enhanced even with limited cognitive resources, but (3) memory for socially emotional images was enhanced only when people had sufficient cognitive resources at encoding. Neither images’ subjective arousal nor their valence modulated these patterns. A subsequent functional magnetic resonance imaging study revealed that biologically emotional images induced stronger activity in the visual cortex and greater functional connectivity between the amygdala and visual cortex than did socially emotional images. These results suggest that the interconnection between the amygdala and visual cortex supports enhanced attention allocation to biological stimuli. In contrast, socially emotional images evoked greater activity in the medial prefrontal cortex (MPFC) and yielded stronger functional connectivity between the amygdala and MPFC than did biological images. Thus, it appears that emotional processing of social stimuli involves elaborative processing requiring frontal lobe activity.

11.
It has been suggested that a high tendency to ruminate reflects deficient emotion regulation. Past research found that people with a high tendency to ruminate show sustained attention to negative stimuli and increased negative thinking, which may result in intensified experiences of negative emotions. Moreover, high levels of rumination were associated with low emotional understanding. Accordingly, we hypothesized that (1) high ruminators (HR) experience more intense emotional reactions than low ruminators (LR) for negative but not positive emotions, (2) LR have higher emotional clarity than HR, and (3) the same pattern of results would hold for brooding but not for reflective pondering. Participants completed a demographic questionnaire, a rumination response style questionnaire, and the Beck Depression Inventory-II. They also rated emotional intensity and identified emotion type for scene pictures from the CAP-D (Categorized Affective Pictures Database). The highest (HR) and lowest (LR) quarters of ruminators were compared on levels of emotional intensity and emotional clarity. We found that HR experienced negative emotions more intensely than LR, with no difference for positive emotions. In contrast to our hypothesis, the two groups did not differ in their emotion understanding. This pattern of results was found for brooding but not for reflective pondering. Our research sheds light on the mechanism underlying rumination and emotion regulation.

12.
Emotional faces and scenes carry a wealth of overlapping and distinct perceptual information. Despite widespread use in the investigation of emotional perception, expressive face and evocative scene stimuli are rarely assessed in the same experiment. Here, we evaluated self-reports of arousal and pleasantness, as well as early and late event-related potentials (e.g., N170, early posterior negativity [EPN], late positive potential [LPP]) as subjects viewed neutral and emotional faces and scenes, including contents representing anger, fear, and joy. Results demonstrate that emotional scenes were rated as more evocative than emotional faces, as only scenes produced elevated self-reports of arousal. In addition, viewing scenes resulted in more extreme ratings of pleasantness (and unpleasantness) than did faces. EEG results indicate that both expressive faces and emotional scenes evoke enhanced negativity in the N170 component, while the EPN and LPP components show significantly enhanced modulation only by scene, relative to face stimuli. These data suggest that viewing emotional scenes results in a more pronounced emotional experience that is associated with reliable modulation of visual event-related potentials that are implicated in emotional circuits in the brain.

13.
The present study examined whether information processing bias against emotional facial expressions is present among individuals with social anxiety. College students with high (high social anxiety group; n = 26) and low social anxiety (low social anxiety group; n = 26) performed three different types of working memory tasks: (a) ordering positive and negative facial expressions according to the intensity of emotion; (b) ordering pictures of faces according to age; and (c) ordering geometric shapes according to size. The high social anxiety group performed significantly more poorly than the low social anxiety group on the facial expression task, but not on the other two tasks with the nonemotional stimuli. These results suggest that high social anxiety interferes with processing of emotionally charged facial expressions.

14.
Perceptual processing of natural scene pictures is enhanced when the scene conveys emotional content. Such “motivated attention” to pleasant and unpleasant pictures has been shown to improve identification accuracy in non-speeded behavioural tasks. An open question is whether emotional content also modulates the speed of visual scene processing. In the present studies we show that unpleasant content reliably slowed two-choice categorization of pictures, irrespective of physical image properties, perceptual complexity, and categorization instructions. Conversely, pleasant content did not slow or even accelerated choice reactions, relative to neutral scenes. As indicated by lateralized readiness potentials, these effects occurred at cognitive processing rather than motor preparation/execution stages. Specifically, analysis of event-related potentials showed a prolongation of early scene discrimination for stimuli perceived as emotionally arousing, regardless of valence, and reflected in delayed peaks of the N1 component. In contrast, the timing of other processing steps, reflected in the P2 and late positive potential components and presumably related to post-discriminatory processes such as stimulus–response mapping, appeared to be determined by hedonic valence, with more pleasant scenes eliciting faster processing. Consistent with this model, varying arousal (low/high) within the emotional categories mediated the effects of valence on choice reaction speed. Functionally, arousal may prolong stimulus analysis in order to prevent erroneous and potentially harmful decisions. Pleasantness may act as a safety signal allowing rapid initiation of overt responses.

15.
Previous research has shown that redundant information in faces and voices leads to faster emotional categorization compared to incongruent emotional information even when attending to only one modality. The aim of the present study was to test whether these crossmodal effects are predominantly due to a response conflict rather than interference at earlier, e.g. perceptual processing stages. In Experiment 1, participants had to categorize the valence and rate the intensity of happy, sad, angry and neutral unimodal or bimodal face–voice stimuli. They were asked to rate either the facial or vocal expression and ignore the emotion expressed in the other modality. Participants responded faster and more precisely to emotionally congruent compared to incongruent face–voice pairs in both the Attend Face and in the Attend Voice condition. Moreover, when attending to faces, emotionally congruent bimodal stimuli were more efficiently processed than unimodal visual stimuli. To study the role of a possible response conflict, Experiment 2 used a modified paradigm in which emotional and response conflicts were disentangled. Incongruency effects were significant even in the absence of response conflicts. The results suggest that emotional signals available through different sensory channels are automatically combined prior to response selection.

16.
Facial stimuli are widely used in behavioural and brain science research to investigate emotional facial processing. However, some studies have demonstrated that dynamic expressions elicit stronger emotional responses compared to static images. To address the need for more ecologically valid and powerful facial emotional stimuli, we created Dynamic FACES, a database of morphed videos (n = 1026) from younger, middle-aged, and older adults displaying naturalistic emotional facial expressions (neutrality, sadness, disgust, fear, anger, happiness). To assess adult age differences in emotion identification of dynamic stimuli and to provide normative ratings for this modified set of stimuli, healthy adults (n = 1822, age range 18–86 years) categorised for each video the emotional expression displayed, rated the expression distinctiveness, estimated the age of the face model, and rated the naturalness of the expression. We found few age differences in emotion identification when using dynamic stimuli. Only for angry faces did older adults show lower levels of identification accuracy than younger adults. Further, older adults outperformed middle-aged adults in the identification of sadness. The use of dynamic facial emotional stimuli has previously been limited, but Dynamic FACES provides a large database of high-resolution naturalistic, dynamic expressions across adulthood. Information on using Dynamic FACES for research purposes can be found at http://faces.mpib-berlin.mpg.de.

17.
This study addressed the relative reliance on face and body configurations for different types of emotion-related judgements: emotional state and motion intention. Participants viewed images of people with either emotionally congruent (both angry or fearful) or incongruent (angry/fearful; fearful/angry) faces and bodies. Congruent conditions provided baseline responses. Incongruent conditions revealed relative reliance on face and body information for different judgements. Body configurations influenced motion-intention judgements more than facial configurations: incongruent pairs with angry bodies were more frequently perceived as moving forward than those with fearful bodies; pairs with fearful bodies were more frequently perceived as moving away. In contrast, faces influenced emotional-state judgements more, but bodies moderated ratings of face emotion. Thus, both face and body configurations influence emotion perception, but the type of evaluation required influences their relative contributions. These findings highlight the importance of considering both the face and body as important sources of emotion information.

19.
Perception of emotion is critical for successful social interaction, yet the neural mechanisms underlying the perception of dynamic, audio-visual emotional cues are poorly understood. Evidence from language and sensory paradigms suggests that the superior temporal sulcus and gyrus (STS/STG) play a key role in the integration of auditory and visual cues. Emotion perception research has focused on static facial cues; however, dynamic audio-visual (AV) cues mimic real-world social cues more accurately than static and/or unimodal stimuli. Novel dynamic AV stimuli were presented using a block design in two fMRI studies, comparing bimodal stimuli to unimodal conditions, and emotional to neutral stimuli. Results suggest that the bilateral superior temporal region plays distinct roles in the perception of emotion and in the integration of auditory and visual cues. Given the greater ecological validity of the stimuli developed for this study, this paradigm may be helpful in elucidating the deficits in emotion perception experienced by clinical populations.

20.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号