Similar Articles
20 similar articles found.
1.
This study examined the effect of sense modality (auditory/visual) on emotional dampening (reduced responsiveness to emotions with elevation in blood pressure). Fifty‐six normotensive participants were assessed on tasks requiring labelling and matching of emotions in faces and voices. Based on median splits of systolic and diastolic blood pressure (SBP and DBP, respectively), participants were divided into low BP, high BP, and isolated BP groups. On emotion‐labelling tasks, analysis revealed reduced emotion recognition in the high BP group compared to the low BP group. On emotion‐matching tasks, reduced emotion recognition was noted in the high BP and also the isolated BP groups compared to the low BP group on the task that required matching a visual target with one of four auditory distractors. Our findings show for the first time that even isolated elevations in either SBP or DBP may result in emotional dampening. Furthermore, the study highlights that the emotional dampening effect generalises to explicit processing (labelling) of emotional information in both faces and voices, and that these effects tentatively occur during more pragmatic and covert (matching) emotion recognition processes too. These findings require replication in clinical hypertensives.
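The grouping described here rests on median splits of SBP and DBP, but the abstract does not spell out the exact classification rule. The minimal Python sketch below illustrates one plausible reading, in which "isolated BP" means elevation on only one of the two measures; the readings used in the example are made up.

```python
import numpy as np

def bp_groups(sbp, dbp):
    """Classify participants by median splits of SBP and DBP.

    Assumed rule (not stated in the abstract): 'low BP' = below the median
    on both measures, 'high BP' = at or above the median on both,
    'isolated BP' = elevated on only one of the two.
    """
    sbp, dbp = np.asarray(sbp, float), np.asarray(dbp, float)
    high_s = sbp >= np.median(sbp)
    high_d = dbp >= np.median(dbp)
    return np.where(high_s & high_d, "high BP",
           np.where(~high_s & ~high_d, "low BP", "isolated BP"))

# Three hypothetical participants (SBP, DBP in mmHg)
print(bp_groups([110, 135, 128], [70, 88, 72]))
```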

2.
Autism spectrum conditions, among them Asperger Syndrome (AS), are generally described as entailing deficits in “cognitive empathy” or “theory of mind”. People with AS tend to have difficulties recognizing emotions, although the extent of these difficulties is still unclear. This study aimed to assess the empathic profile of youth with AS (N = 38) and controls matched on age, sex and IQ, to test whether a dissociation between cognitive and affective empathy exists in AS, and to explore emotion recognition in people with AS and how it relates to emotional valence (positive, negative, and neutral emotions). The AS group scored lower than controls on cognitive empathy but scored within the average range on affective empathy. A deficit in emotion recognition was found in the AS group for positive emotions. These results confirm earlier findings on cognitive empathy and provide new insight into emotion recognition abilities in this population.

3.
Research on automatic attention to emotional faces offers mixed results. We therefore examined validity effects for facial expressions of different emotions (compared to neutral faces) with a dot-probe paradigm in seven studies (total N = 308). Systematic variations of type of emotion, CTI (cue–target interval), task, cue size, and masking allow for a differentiated assessment of attentional capture by emotions and possible moderating factors. Results indicate a general absence of emotional validity effects as well as a lack of significant interactions with any of the manipulated factors, indicating that facial expressions of emotions do not capture attention in a fully automatic fashion. These findings suggest that situational and contextual factors have to be taken into account when investigating attentional capture by emotional faces.
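For reference, the validity effect in a dot-probe paradigm is conventionally the reaction-time difference between invalid trials (probe replaces the neutral face) and valid trials (probe replaces the emotional face). The short sketch below illustrates that standard computation with hypothetical reaction times; it is not drawn from these studies' own analysis code.

```python
import numpy as np

def validity_effect(rt_valid, rt_invalid):
    """Conventional dot-probe attention-bias score in milliseconds.

    Positive values indicate attentional capture by the emotional cue
    (faster responses when the probe appears at the emotional face's
    location); values near zero indicate no capture.
    """
    return float(np.mean(rt_invalid) - np.mean(rt_valid))

# Hypothetical per-trial reaction times (ms) for one participant
rt_valid = [512, 498, 530, 505]    # probe at the emotional face's location
rt_invalid = [518, 501, 529, 507]  # probe at the neutral face's location
print(f"validity effect: {validity_effect(rt_valid, rt_invalid):.1f} ms")
```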

4.
The aim of this study was to examine the moderating role of emotional awareness in the relationship between emotion regulation strategies and emotional information processing. A total of 120 female students regulated emotions while watching an unpleasant film. Before and after emotion induction, participants completed a set of tasks that required matching facial expressions. The results demonstrated that participants who were high in emotional awareness showed a significantly smaller increase in error responses (i.e., incorrect matches) than participants who were low in emotional awareness. However, this effect was observed only in the suppression (i.e., inhibition of emotionally expressive behavior), masking (i.e., covering the experienced emotion with a happy expression) and control (i.e., no regulation) conditions. Among reappraisers, who were instructed to adopt a neutral attitude toward the film, there was no significant increase in error responses regardless of whether they were high or low in emotional awareness. This study shows that the potentially damaging impact of negative emotions on the processing of emotional information can be prevented by high emotional awareness or by the implementation of reappraisal as an emotion regulation strategy.

5.
Past research has shown that children recognize emotions from facial expressions poorly and improve only gradually with age, but the stimuli in such studies have been static faces. Because dynamic faces include more information, it may well be that children more readily recognize emotions from dynamic facial expressions. The current study of children (N = 64, aged 5–10 years) who freely labeled the emotion conveyed by static and dynamic facial expressions found no advantage of dynamic over static expressions; in fact, reliable differences favored static expressions. An alternative explanation of gradual improvement with age is that children's emotional categories change during development from a small number of broad emotion categories to a larger number of narrower categories, a pattern found here with both static and dynamic expressions.

6.
Previous research has found that individuals vary greatly in emotion differentiation, that is, the extent to which they distinguish between different emotions when reporting on their own feelings. Building on previous work showing that emotion differentiation is associated with individual differences in intrapersonal functioning, the current study asks whether emotion differentiation is also related to interpersonal skills. Specifically, we examined whether individuals who are high in emotion differentiation are more accurate in recognising others’ emotional expressions. We report two studies in which we used an established paradigm tapping negative emotion differentiation and several emotion recognition tasks. In Study 1 (N = 363), we found that individuals high in emotion differentiation were more accurate in recognising others’ emotional facial expressions. Study 2 (N = 217) replicated this finding using emotion recognition tasks with varying amounts of emotional information. These findings suggest that the knowledge we use to understand our own emotional experience also helps us understand the emotions of others.

7.
Anxiety is conceptualized as a state of negative emotional arousal that is accompanied by concern about future threat. The purpose of this meta-analytic review was to evaluate the evidence of associations between emotional competence and anxiety by examining how specific emotional competence domains (emotion recognition, emotion expression, emotion awareness, emotion understanding, acceptance of emotion, emotional self-efficacy, sympathetic/empathic responses to others’ emotions, recognition of how emotion communication and self-presentation affect relationships, and emotion regulatory processes) relate to anxiety in childhood and adolescence. A total of 185 studies were included in a series of meta-analyses (Ns ranged from 573 to 25,711). Results showed that anxious youth are less effective at expressing (r = −0.15) and understanding emotions (r = −0.20), are less aware of (r = −0.28) and less accepting of their own emotions (r = −0.49), and report lower emotional self-efficacy (r = −0.36). More anxious children use more support-seeking coping strategies (r = 0.07) and are more likely to use less adaptive coping strategies, including avoidant coping (r = 0.18), externalizing (r = 0.18), and maladaptive cognitive coping (r = 0.34). Emotion acceptance and awareness, emotional self-efficacy, and maladaptive cognitive coping yielded the largest effect sizes. Some effects varied with children's age. The findings inform intervention and treatment programs for anxiety in youth and identify several areas for future research.
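As context for the pooled correlations reported above, one common way to aggregate study-level correlations (not necessarily the exact model used in this meta-analysis) is to convert each r to Fisher's z, weight by inverse variance, and back-transform. The sketch below uses hypothetical study-level inputs for illustration only.

```python
import math

def pooled_correlation(rs, ns):
    """Fixed-effect pooling of correlations via Fisher's z.

    Each r is transformed to z = atanh(r), weighted by its inverse
    variance (n - 3), averaged, and back-transformed with tanh.
    Generic illustration, not the authors' exact analysis.
    """
    zs = [math.atanh(r) for r in rs]
    weights = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return math.tanh(z_bar)

# Hypothetical correlations between anxiety and emotion awareness
print(round(pooled_correlation([-0.25, -0.31, -0.22], [120, 85, 240]), 3))
```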

8.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for the basic emotions of happiness, anger, fear, sadness, surprise, and disgust: 30 pictures (5 for each emotion) were displayed to 96 participants to assess recognition accuracy. The results showed that recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information congruent or incongruent with a facial expression was displayed before presenting pictures of facial expressions. The results of the second experiment showed that congruent information improved facial expression recognition, whereas incongruent information impaired it.

9.
The present study investigated the potential protective role of components of emotion knowledge (i.e., emotion recognition, situation knowledge) in the links between young children's shyness and indices of socio‐emotional functioning. Participants were N = 163 children (82 boys and 81 girls) aged 23–77 months (M = 53.29, SD = 14.48), recruited from preschools in Italy. Parents provided ratings of child shyness and teachers rated children's socio‐emotional functioning at preschool (i.e., social competence, anxiety‐withdrawal, peer rejection). Children were also interviewed to assess their abilities to recognize facial emotional expressions and identify situations that affect emotions. Among the results, shyness was positively related to anxiety‐withdrawal and peer rejection. In addition, emotion recognition was found to significantly moderate the links between shyness and preschool socio‐emotional functioning, appearing to serve a buffering role. For example, at lower levels of emotion recognition, shyness was positively associated with both anxiety‐withdrawal and rejection by peers, but at higher levels of emotion recognition, these associations were attenuated. Results are discussed in terms of the protective role of emotion recognition in promoting shy children's positive socio‐emotional functioning within the classroom context.

10.
This paper addresses methodological considerations relevant to research on nonverbal communication of emotion. In order to gather more information about the interpretations given to spontaneous and dynamic facial expressions, two main objectives guide the present exploratory research. The first is to obtain naturalistic recordings of emotional expressions in realistic settings that are ‘emotional enough’. The second is to address the issue of dynamic judgments of facial expressions of emotion, that is, real-time emotional recognition. An innovative device was created for this specific purpose. Results show that, although the social nature of the eliciting situation is minimal, the experience of some emotions is reflected on encoders' faces while they are covertly videotaped in natural conditions. Moreover, the results show the utility of investigating dynamic emotional judgments of spontaneous and dynamic expressions, since observers seem to be sensitive to the slightest facial expression change in making their emotional judgments. A promising paradigm is thus proposed for the study of the dynamics of real-time nonverbal emotional interaction. Copyright © 2007 John Wiley & Sons, Ltd.

11.
This study addressed the degree to which adults' emotional states influence their perception of emotional states in children and their motivation to change such states. Happiness, sadness, anger, or a neutral state was induced in adults, who then viewed slides of 4-year-old children who were actually experiencing various emotional states. Adults' own emotional states had little impact on their accurate recognition of children's emotions or on their motives for social action to change such emotions. However, adults' states did influence the intensity they assigned to children's emotions, with happy adults tending to rate some emotions as more intense for black children (sadness) and for girls (anger and neutrality). The base rates with which adults used different emotion labels also influenced judgments, increasing recognition of happiness and reducing recognition of anger. The results are discussed in terms of the factors that influence whether or not emotional states affect judgment processes and the role of emotion labels in the effective recognition of ongoing emotional states. Also addressed is the consequence of adults' recognition of emotion in children for the effective socialization of emotion.

12.
Previous research suggests that labelling emotions, or describing affective states using emotion words, facilitates emotion regulation. But how much labelling promotes emotion regulation? And which emotion regulation strategies does emotion labelling promote? Drawing on cognitive theories of emotion, we predicted that labelling emotions using fewer words would be less confusing and would facilitate forms of emotion regulation requiring more cognitively demanding processing of context. Participants (N = 82) mentally immersed themselves in an emotional vignette, were randomly assigned to an exhaustive or minimal emotion labelling manipulation, and then completed an emotion regulation strategy planning task. Minimal (vs. exhaustive) emotion labelling promoted higher subjective emotional clarity. Furthermore, in terms of specific emotion regulation strategies, minimal emotion labelling prompted more plans for problem solving and marginally more plans for reappraisal, but did not affect plans for behavioural activation or social support seeking. We discuss implications for the cognitive mechanisms supporting the generation of emotion regulation strategies.

13.
The ability to recognize emotions from others’ nonverbal behavior (emotion recognition ability, ERA) is crucial to successful social functioning. However, no self-administered ERA training covering multiple sensory channels currently exists for non-clinical adults. We conducted four studies in a lifespan sample of participants in the laboratory and online (total N = 531) to examine the effectiveness of a short computer-based training for 14 different emotions using audiovisual clips of emotional expressions. Results showed that, overall, young and middle-aged participants who had received the training scored significantly higher on facial, vocal, and audiovisual emotion recognition than the control groups. The training effect for audiovisual ERA persisted over 4 weeks. In older adults (59–90 years), however, the training had no effect. The new, brief training could be useful in applied settings such as professional training, at least for younger and middle-aged adults. In older adults, improving ERA might require a longer and more interactive intervention.

14.
This study investigates cross-modal simultaneous processing of emotional tone of voice and emotional facial expression by means of event-related potentials (ERPs), using a wide range of emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual stimuli (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N = 31) were required to watch and listen to the stimuli in order to comprehend them. Repeated-measures ANOVAs revealed a positive ERP deflection (P2) with a more posterior distribution. This P2 effect may represent a marker of cross-modal integration, modulated as a function of the congruous/incongruous condition: it showed a larger peak in response to congruous stimuli than to incongruous ones. It is suggested that the P2 can be a cognitive marker of multisensory processing, independent of the emotional content.

15.
The recognition of nonverbal emotional signals and the integration of multimodal emotional information are essential for successful social communication among humans of any age. Whereas prior studies of age dependency in emotion recognition often focused on either the prosodic or the facial aspect of nonverbal signals, our purpose was to create a more naturalistic setting by presenting dynamic stimuli under three experimental conditions: auditory, visual, and audiovisual. Eighty-four healthy participants (44 women, 40 men; age range 20–70 years) were tested on their ability to recognize emotions either mono- or bimodally on the basis of emotional (happy, alluring, angry, disgusted) and neutral nonverbal stimuli from voice and face. Additionally, we assessed visual and auditory acuity, working memory, verbal intelligence, and emotional intelligence to explore potential explanatory effects of these population parameters on the relationship between age and emotion recognition. Applying unbiased hit rates as the performance measure, we analyzed the data with linear regression analyses, t tests, and mediation analyses. We found a linear, age-related decrease in emotion recognition independent of stimulus modality and emotional category. In contrast, the improvement in recognition rates associated with audiovisual integration of bimodal stimuli seems to be maintained over the life span. The reduction in emotion recognition ability at an older age could not be sufficiently explained by age-related decreases in hearing, vision, working memory, or verbal intelligence. These findings suggest alterations in social perception at a level of complexity beyond basic perceptual and cognitive abilities.
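The unbiased hit rate (Wagner, 1993) used as the performance measure here is computed from a stimulus-by-response confusion matrix: the squared number of correct responses for a category divided by the product of how often that category was presented and how often that response was given. The sketch below shows this standard formula with made-up counts, not the study's data.

```python
import numpy as np

def unbiased_hit_rates(confusion):
    """Unbiased hit rates (Wagner, 1993) per emotion category.

    For category i: Hu_i = hits_i**2 / (stimuli_i * responses_i),
    where stimuli_i is the row total (times the category was shown)
    and responses_i is the column total (times that label was chosen).
    """
    confusion = np.asarray(confusion, float)
    hits = np.diag(confusion)
    stimuli = confusion.sum(axis=1)    # presentations per category
    responses = confusion.sum(axis=0)  # uses of each response label
    return hits ** 2 / (stimuli * responses)

# Hypothetical confusion matrix: rows = presented emotion,
# columns = chosen label (happy, angry, disgusted)
conf = [[18, 1, 1],
        [2, 15, 3],
        [1, 4, 15]]
print(np.round(unbiased_hit_rates(conf), 3))
```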

16.
The present study utilized a short‐term longitudinal research design to examine the hypothesis that shyness in preschoolers is differentially related to different aspects of emotion processing. Using teacher reports of shyness and performance measures of emotion processing, including (1) facial emotion recognition, (2) non‐facial emotion recognition, and (3) emotional perspective‐taking, we examined 337 Head Start attendees twice at a 24‐week interval. Results revealed significant concurrent and longitudinal relationships between shyness and facial emotion recognition, and either minimal or non‐existent relationships between shyness and the other aspects of emotion processing. Correlational analyses of concurrent assessments revealed that shyness predicted poorer facial emotion recognition scores for negative emotions (sad, angry, and afraid), but not a positive emotion (happy). Analyses of change over time, on the other hand, revealed that shyness predicted change in facial emotion recognition scores for all four measured emotions. Facial emotion recognition scores did not predict changes in shyness. Results are discussed with respect to expanding the scope of research on shyness and emotion processing to include time‐dependent studies that allow for the specification of developmental processes. Copyright © 2007 John Wiley & Sons, Ltd.

17.
Which brain regions are associated with recognition of emotional prosody? Are these distinct from those for recognition of facial expression? These issues were investigated by mapping the overlaps of co-registered lesions from 66 brain-damaged participants as a function of their performance in rating basic emotions. It was found that recognizing emotions from prosody draws on the right frontoparietal operculum, the bilateral frontal pole, and the left frontal operculum. Recognizing emotions from prosody and facial expressions draws on the right frontoparietal cortex, which may be important in reconstructing aspects of the emotion signaled by the stimulus. Furthermore, there were regions in the left and right temporal lobes that contributed disproportionately to recognition of emotion from faces or prosody, respectively.

18.
We compare matching of facial expressions of emotion, completion of the positive valence of emotional expression, attunement of emotional intensity, and non-matching of emotion in the engagements of firstborn dizygotic twins and of singletons with their mothers. Nine twins and nine singletons were video-recorded at home in spontaneous face-to-face interactions from the second to the sixth month after birth. Microanalysis of infant and maternal facial expressions of emotion revealed qualitative and quantitative differences indicating that engagements with twins involved more frequent and more accurate emotional matching and attunement than those with singletons. Singletons displayed more emotional completion and non-matching reactions. Expressions of matching for pleasure and interest followed different developmental patterns in the two kinds of dyads. These results are discussed in relation to the theory of innate affective intersubjectivity. Differences may shed light on the relationship between sharing early life with a twin and the development of self-other awareness.

19.
Few studies have examined potential differences between social anxiety disorder (SAD) and generalised anxiety disorder (GAD) in the sensitivity to detect emotional expressions. The present study aims to compare the detection of emotional expressions in SAD and GAD. Participants with a primary diagnosis of GAD (n = 46), SAD (n = 70), and controls (n = 118) completed a morph movies task. The task presented faces expressing increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. Participants used a slide bar to view the movie frames from left to right and to stop at the first frame where they perceived an emotion. The frame selected thus indicated the intensity of emotion required to identify the facial expression. Participants with GAD detected the onset of facial emotions at a lower intensity of emotion than participants with SAD (p = 0.002) and controls (p = 0.039). In a multiple regression analysis controlling for age, race, and depressive symptom severity, a lower frame at which the emotion was detected was independently associated with GAD diagnosis (B = −5.73, SE = 1.74, p …).

20.
Our facial expressions give others the opportunity to access our feelings and constitute an important nonverbal tool for communication. Many recent studies have investigated emotional perception in adults, and our knowledge of the neural processes involved in emotions is increasingly precise. Young children also use faces to express their internal states and perceive emotions in others, but little is known about the neurodevelopment of expression recognition. The goal of the current study was to determine the normal development of facial emotion perception. We recorded ERPs in 82 children 4 to 15 years of age during an implicit processing task with emotional faces. Task and stimuli were the same as those used and validated in an adult study; we focused on the components that showed sensitivity to emotions in adults (P1, N170, and a frontal slow wave). An effect of the emotion expressed by the faces was seen on the P1 in the youngest children. With increasing age this effect disappeared while an emotional sensitivity emerged on the N170. Early emotional processing in young children differed from that observed in the adolescents, who approached the adult pattern. In contrast, the later frontal slow wave, although showing typical age effects, was more positive for neutral and happy faces across age groups. Thus, despite the precocious utilization of facial emotions, the neural processing involved in the perception of emotional faces develops in a staggered fashion throughout childhood, with the adult pattern appearing only late in adolescence.
