Similar documents
 20 similar documents found (search time: 46 ms)
1.
This study examined to what extent children and adults differ in how they process negative emotions during reading, and how they rate their own and protagonists’ emotional states. Results show that both children’s and adults’ processing of target sentences was facilitated when they described negative emotions. Processing of spill-over sentences was facilitated for adults but inhibited for children, suggesting children needed additional time to process protagonists’ emotional states and integrate them into coherent mental representations. Children and adults were similar in their valence and arousal ratings as they rated protagonists’ emotional states as more negative and more intense than their own emotional states. However, they differed in that children rated their own emotional states as relatively neutral, whereas adults’ ratings of their own emotional states more closely matched the negative emotional states of the protagonists. This suggests a possible difference between children and adults in the mechanism underlying emotional inferencing.

2.
Studies on adults have revealed a disadvantageous effect of negative emotional stimuli on executive functions (EF), and it is suggested that this effect is amplified in children. The present study’s aim was to assess how emotional facial expressions affected working memory in 9- to 12-year-olds, using a working memory task with emotional facial expressions as stimuli. Additionally, we explored how the degree of internalizing and externalizing symptoms in typically developing children was related to performance on the same task. Before employing the working memory task, an independent sample of 9- to 12-year-olds was asked to recognize the facial expressions intended to serve as stimuli and to rate each expression for the degree to which the emotion was expressed and for arousal, to obtain a baseline for how children in this age range recognize and react to facial expressions. This first study revealed that children rated the facial expressions with similar intensity and arousal across age. When employing the working memory task with facial expressions, results revealed that negatively valenced expressions impaired working memory more than neutral and positively valenced expressions. The ability to successfully complete the working memory task increased between 9 and 12 years of age. Children’s total problems were associated with poorer performance on the working memory task with facial expressions. Results on the effect of emotion on working memory are discussed in light of recent models and empirical findings on how emotional information might interact and interfere with cognitive processes such as working memory.

3.
4.
Older adults perceive less intense negative emotion in facial expressions compared to younger counterparts. Prior research has also demonstrated that mood alters facial emotion perception. Nevertheless, there is little evidence evaluating the interactive effects of age and mood on emotion perception. This study investigated the effects of sad mood on younger and older adults’ perception of emotional and neutral faces. Participants rated the intensity of stimuli while listening to sad music and in silence. Measures of mood were administered. Younger and older participants rated sad faces as displaying stronger sadness when they experienced sad mood. While younger participants showed no influence of sad mood on happiness ratings of happy faces, older adults rated happy faces as conveying less happiness when they experienced sad mood. This study demonstrates how emotion perception can change when a controlled mood induction procedure is applied to alter mood in young and older participants.

5.
Arousal and valence (pleasantness) are considered primary dimensions of emotion. However, the degree to which these dimensions interact in emotional processing across sensory modalities is poorly understood. We addressed this issue by applying a crossmodal priming paradigm in which auditory primes (Romantic piano solo music) varying in arousal and/or pleasantness were sequentially paired with visual targets (IAPS pictures). In Experiment 1, the emotion spaces of 120 primes and 120 targets were explored separately in addition to the effects of musical training and gender. Thirty-two participants rated their felt pleasantness and arousal in response to primes and targets on equivalent rating scales as well as their familiarity with the stimuli. Musical training was associated with elevated familiarity ratings for high-arousing music and a trend for elevated arousal ratings, especially in response to unpleasant musical stimuli. Males reported higher arousal than females for pleasant visual stimuli. In Experiment 2, 40 nonmusicians rated their felt arousal and pleasantness in response to 20 visual targets after listening to 80 musical primes. Arousal associated with the musical primes modulated felt arousal in response to visual targets, yet no such transfer of pleasantness was observed between the two modalities. Experiment 3 sought to rule out the possibility of any order effect of the subjective ratings, and responses of 14 nonmusicians replicated results of Experiment 2. This study demonstrates the effectiveness of the crossmodal priming paradigm in basic research on musical emotions.

6.
Remembering is impacted by several factors of retrieval, including the emotional content of a memory cue. Here we tested how musical retrieval cues that differed on two dimensions of emotion—valence (positive and negative) and arousal (high and low)—impacted the following aspects of autobiographical memory recall: the response time to access a past personal event, the experience of remembering (ratings of memory vividness), the emotional content of a cued memory (ratings of event arousal and valence), and the type of event recalled (ratings of event energy, socialness, and uniqueness). We further explored how cue presentation affected autobiographical memory retrieval by administering cues of similar arousal and valence levels in a blocked fashion to one half of the tested participants, and randomly to the other half. We report three main findings. First, memories were accessed most quickly in response to musical cues that were highly arousing and positive in emotion. Second, we observed a relation between a cue and the elicited memory’s emotional valence but not arousal; however, both the cue valence and arousal related to the nature of the recalled event. Specifically, high cue arousal led to lower memory vividness and uniqueness ratings, but cues with both high arousal and positive valence were associated with memories rated as more social and energetic. Finally, cue presentation impacted both how quickly and specifically memories were accessed and how cue valence affected the memory vividness ratings. The implications of these findings for views of how emotion directs the access to memories and the experience of remembering are discussed.

7.
We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness-fear and anger-disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness-fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise-sadness and excitement-disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.

8.
Four different patterns of biased ratings of facial expressions of emotions have been found in socially anxious participants: higher negative ratings of (1) negative, (2) neutral, and (3) positive facial expressions than nonanxious controls. As a fourth pattern, some studies have found no group differences in ratings of facial expressions of emotion. However, these studies usually employed valence and arousal ratings that arguably may be less able to reflect processing of social information. We examined the relationship between social anxiety and face ratings for perceived trustworthiness given that trustworthiness is an inherently socially relevant construct. Improving on earlier analytical strategies, we evaluated the four previously found result patterns using a Bayesian approach. Ninety-eight undergraduates rated 198 face stimuli on perceived trustworthiness. Subsequently, participants completed social anxiety questionnaires to assess the severity of social fears. Bayesian modeling indicated that the probability that social anxiety did not influence judgments of trustworthiness had at least three times more empirical support in our sample than assuming any kind of negative interpretation bias in social anxiety. We concluded that the deviant interpretation of facial trustworthiness is not a relevant aspect in social anxiety.

9.
Emotion can be conceptualized by the dimensional account of emotion, with the dimensions of valence and arousal. There has been little discussion of differences in discriminability across these dimensions. The present study hypothesized that any pair of emotional expressions differing in the polarity of both the valence and arousal dimensions would be easier to distinguish than a pair differing in only one dimension. The results indicate that the number of differing dimensions did not affect participants’ reaction times. Most pairs of emotional expressions, except those involving fear, were similarly discriminable. Reaction times to pairs with a fearful expression were faster than to those without. The fast reaction time to fearful facial expressions underscores the survival value of emotions.

10.

11.
People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture.

12.
We propose a computational model for identifying the emotional state of a facial expression from appraisal scores given by human observers, utilizing differences in their perception. The appraisal model of human emotion is adopted as the basis of this evaluation process, with appraisal variables as output. We investigated performance for both categorical and continuous representations of the variables appraised by human observers. Analysis of the data shows a higher degree of agreement between estimated Indian ratings and the available reference when the variables are rated in the continuous domain. We also observed that emotional states with negative valence influence the perception of a hybrid emotional state such as ‘Surprise’ only when appraisal variables are labeled through categories of emotions. Thus, the proposed method has implications for developing software that detects emotion using appraisal variables in the continuous domain, perceived from the facial expression of an agent (or human subject). Further, this model can be customized to include cultural variability in recognizing emotions.

13.
The ability to recognize others’ facial emotions has become increasingly important since the COVID-19 pandemic, which created stressful situations for emotion regulation. Considering the importance of emotion in maintaining a social life, the emotion knowledge needed to perceive and label emotions of oneself and others requires an understanding of affective dimensions, such as emotional valence and emotional arousal. However, limited information is available about whether the behavioral representation of affective dimensions is similar to their neural representation. To explore the relationship between brain and behavior in the representational geometries of affective dimensions, we constructed a behavioral paradigm in which emotional faces were categorized into geometric spaces along the valence dimension, the arousal dimension, and the combined valence and arousal dimensions. We then compared these representations to neural representations of the faces acquired by functional magnetic resonance imaging. We found that affective dimensions were similarly represented in behavior and brain. Specifically, behavioral and neural representations of valence were less similar to those of arousal. We also found that valence was represented in the dorsolateral prefrontal cortex, frontal eye fields, precuneus, and early visual cortex, whereas arousal was represented in the cingulate gyrus, middle frontal gyrus, orbitofrontal cortex, fusiform gyrus, and early visual cortex. In conclusion, the current study suggests that dimensional emotions are similarly represented in behavior and brain and are represented with distinct topographical organizations in the brain.

14.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects were also scored on an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication.

15.
We present the results of a study testing the often-theorized role of musical expectations in inducing listeners’ emotions in a live flute concert experiment with 50 participants. Using an audience response system developed for this purpose, we measured subjective experience and peripheral psychophysiological changes continuously. To confirm the existence of the link between expectation and emotion, we used a threefold approach. (1) On the basis of an information-theoretic cognitive model, melodic pitch expectations were predicted by analyzing the musical stimuli used (six pieces of solo flute music). (2) A continuous rating scale was used by half of the audience to measure their experience of unexpectedness toward the music heard. (3) Emotional reactions were measured using a multicomponent approach: subjective feeling (valence and arousal rated continuously by the other half of the audience members), expressive behavior (facial EMG), and peripheral arousal (the latter two being measured in all 50 participants). Results confirmed the predicted relationship between high-information-content musical events, the violation of musical expectations (in corresponding ratings), and emotional reactions (psychologically and physiologically). Musical structures leading to expectation reactions were manifested in emotional reactions at different emotion component levels (increases in subjective arousal and autonomic nervous system activations). These results emphasize the role of musical structure in emotion induction, leading to a further understanding of the frequently experienced emotional effects of music.

16.
Functional neuroimaging and lesion-based neuropsychological experiments have demonstrated the human amygdala's role in recognition of certain emotions signaled by sensory stimuli, notably, fear and anger in facial expressions. We examined recognition of two emotional dimensions, arousal and valence, in a rare subject with complete, bilateral damage restricted to the amygdala. Recognition of emotional arousal was impaired for facial expressions, words, and sentences that depicted unpleasant emotions, especially in regard to fear and anger. However, recognition of emotional valence was normal. The findings suggest that the amygdala plays a critical role in knowledge concerning the arousal of negative emotions, a function that may explain the impaired recognition of fear and anger in patients with bilateral amygdala damage, and one that is consistent with the amygdala's role in processing stimuli related to threat and danger.

17.
The present study investigates the perception of facial expressions of emotion, and explores the relation between the configural properties of expressions and their subjective attribution. Stimuli were a male and a female series of morphed facial expressions, interpolated between prototypes of seven emotions (happiness, sadness, fear, anger, surprise and disgust, and neutral) from Ekman and Friesen (1976). Topographical properties of the stimuli were quantified using the Facial Expression Measurement (FACEM) scheme. Perceived dissimilarities between the emotional expressions were elicited using a sorting procedure and processed with multidimensional scaling. Four dimensions were retained in the reconstructed facial-expression space, with positive and negative expressions opposed along D1, while the other three dimensions were interpreted as affective attributes distinguishing clusters of expressions categorized as "Surprise-Fear," "Anger," and "Disgust." Significant relationships were found between these affective attributes and objective facial measures of the stimuli. The findings support a componential explanatory scheme for expression processing, wherein each component of a facial stimulus conveys an affective value separable from its context, rather than a categorical-gestalt scheme. The findings further suggest that configural information is closely involved in the decoding of affective attributes of facial expressions. Configural measures are also suggested as a common ground for dimensional as well as categorical perception of emotional faces.

18.
Facial stimuli are widely used in behavioural and brain science research to investigate emotional facial processing. However, some studies have demonstrated that dynamic expressions elicit stronger emotional responses compared to static images. To address the need for more ecologically valid and powerful facial emotional stimuli, we created Dynamic FACES, a database of morphed videos (n = 1026) from younger, middle-aged, and older adults displaying naturalistic emotional facial expressions (neutrality, sadness, disgust, fear, anger, happiness). To assess adult age differences in emotion identification of dynamic stimuli and to provide normative ratings for this modified set of stimuli, healthy adults (n = 1822, age range 18–86 years) categorised for each video the emotional expression displayed, rated the expression distinctiveness, estimated the age of the face model, and rated the naturalness of the expression. We found few age differences in emotion identification when using dynamic stimuli. Only for angry faces did older adults show lower levels of identification accuracy than younger adults. Further, older adults outperformed middle-aged adults in the identification of sadness. The use of dynamic facial emotional stimuli has previously been limited, but Dynamic FACES provides a large database of high-resolution, naturalistic, dynamic expressions across adulthood. Information on using Dynamic FACES for research purposes can be found at http://faces.mpib-berlin.mpg.de.

19.
Interpreting and responding to an infant's emotional cues is a fundamental parenting skill. Responsivity to infant cues is frequently disrupted in depression, negatively impacting child outcomes, which underscores its importance. It is widely assumed that women, and in particular mothers, show greater attunement to infants than do men. However, empirical evidence for sex and parental-status effects, particularly in relation to the perception of infant emotion, has been lacking. In this study, men and women with and without young infants were asked to rate the valence of a range of infant facial expressions, on a scale from very positive to very negative. Results suggested complex interaction effects between parental status, sex, and the facial expression being rated. Mothers provided more positive ratings of the happy expressions and more extreme ratings of the intense emotion expressions than fathers did, but non-mothers and non-fathers did not differ. Low-level depressive symptoms were also found to correlate with more negative ratings of negative infant facial expressions across the entire sample. Overall, these results suggest that parental status might have differential effects on men's and women's appraisal of infant cues. Differences between fathers’ and mothers’ perceptions of infant emotion might be of interest in understanding variance in interaction styles, such as the proportion of time spent in play.

20.
Participants rated 84 statements adapted from Velten's original mood induction statements—designed to induce positive and negative mood—on two dimensions of emotion (valence and arousal), using the Self Assessment Manikin (SAM) (P. J. Lang, M. M. Bradley, & B. N. Cuthbert, 1999). Fifty-two of these Velten positive, negative, and neutral statements yielded SAM valence ratings that were consistent with Velten's previous valence designation (E. Velten, 1968). Reliability analyses for the positive, negative, and neutral statements indicated a high level of internal consistency in the three statement groups. Arousal and valence ratings of the statements were positively correlated. Related issues concerning differences in rating verbal versus visual emotional stimuli and recommendations for future work to improve the validity of Velten's mood induction statements are addressed.

