Similar Literature
20 similar documents found
1.
Psychological factors are known to play an important part in the origin of many medical conditions, including hypertension. Recent studies have reported that elevated blood pressure (even within the normal range of variation) is associated with reduced responsiveness to emotions, or ‘emotional dampening’. Our aim was to assess emotional dampening in individuals with more extreme blood pressure levels, including prehypertensives (N = 58) and hypertensives (N = 60), by comparing their emotion recognition ability with that of normotensives (N = 57). Participants completed novel facial emotion matching and facial emotion labelling tasks following blood pressure measurement, and their accuracy of emotion recognition and average response times were compared. The normotensives demonstrated significantly higher accuracy than the prehypertensives and the hypertensives in labelling facial emotions. This difference generalised to the task in which two facial halves (upper and lower) had to be matched on the basis of emotion. In neither the labelling nor the matching condition did the groups differ in their speed of emotion processing. The findings extend reports of ‘emotional dampening’ to hypertensives as well as to those at risk of developing hypertension (i.e. prehypertensives) and have important implications for understanding the psychological component of medical conditions such as hypertension.

2.
Efficient navigation of our social world depends on the generation, interpretation, and combination of social signals within different sensory systems. However, the influence of healthy adult aging on multisensory integration of emotional stimuli remains poorly explored. This article comprises 2 studies that directly address issues of age differences on cross-modal emotional matching and explicit identification. The first study compared 25 younger adults (19-40 years) and 25 older adults (60-80 years) on their ability to match cross-modal congruent and incongruent emotional stimuli. The second study looked at performance of 20 younger (19-40) and 20 older adults (60-80) on explicit emotion identification when information was presented congruently in faces and voices or only in faces or in voices. In Study 1, older adults performed as well as younger adults on tasks in which congruent auditory and visual emotional information were presented concurrently, but there were age-related differences in matching incongruent cross-modal information. Results from Study 2 indicated that though older adults were impaired at identifying emotions from 1 modality (faces or voices alone), they benefited from congruent multisensory information as age differences were eliminated. The findings are discussed in relation to social, emotional, and cognitive changes with age.

3.
The voice is a marker of a person's identity that allows individual recognition even if the person is not in sight. Listening to a voice also affords inferences about the speaker's emotional state. Both these types of personal information are encoded in characteristic acoustic feature patterns analyzed within the auditory cortex. In the present study, 16 volunteers listened to pairs of non-verbal voice stimuli with happy or sad valence in two different task conditions while event-related brain potentials (ERPs) were recorded. In an emotion matching task, participants indicated whether the expressed emotion of a target voice was congruent or incongruent with that of a (preceding) prime voice. In an identity matching task, participants indicated whether or not the prime and target voice belonged to the same person. Effects based on the emotion expressed occurred earlier than those based on voice identity. Specifically, P2 amplitudes (at approximately 200 ms) were reduced for happy voices when primed by happy voices. Identity match effects, by contrast, did not start until around 300 ms. These results show an early, task-specific, emotion-based influence on auditory sensory processing.

4.
The most familiar emotional signals consist of faces, voices, and whole-body expressions, but so far research on emotions expressed by the whole body is sparse. The authors investigated recognition of whole-body expressions of emotion in three experiments. In the first experiment, participants performed a body expression-matching task. Results indicate good recognition of all emotions, with fear being the hardest to recognize. In the second experiment, two-alternative forced-choice categorizations of the facial expression of a compound face-body stimulus were strongly influenced by the bodily expression. This effect was a function of the ambiguity of the facial expression. In the third experiment, recognition of emotional tone of voice was similarly influenced by task-irrelevant emotional body expressions. Taken together, the findings illustrate the importance of emotional whole-body expressions in communication, whether viewed on their own or, as is often the case in realistic circumstances, in combination with facial expressions and emotional voices.

5.
Three age groups of participants (6–8 years, 9–11 years, adults) performed two tasks: a face recognition task and a Garner task. In the face recognition task, the participants were presented with 20 faces and then had to recognize them among 20 new faces. In the Garner task, the participants had to sort, as fast as possible, photographs of two persons expressing two emotions by taking into account only one of the two dimensions (identity or emotion). When the sorting task was on one dimension, the other dimension was varied in either a correlated, a constant, or an orthogonal way in distinct subsessions. The results indicated an increase in face recognition ability with age. They also showed an interference of identity in the emotion-sorting task that was similar in the three age groups. Nevertheless, an interference of emotion in the identity-sorting task was significant only for the children and was greatest in the youngest group. These observations suggest that the development of face recognition ability rests on the development of the ability to attend selectively to identity without paying attention to emotional facial expression.

6.
The present study aimed to quantify the magnitude of sex differences in humans' ability to accurately recognise non-verbal emotional displays. Relevant studies were those that required explicit labelling of discrete emotions presented in the visual and/or auditory modality. A final set of 551 effect sizes from 215 samples was included in a multilevel meta-analysis. The results showed a small overall advantage in favour of females on emotion recognition tasks (d = 0.19). However, the magnitude of that sex difference was moderated by several factors, namely specific emotion, emotion type (negative, positive), sex of the actor, sensory modality (visual, audio, audio-visual) and age of the participants. Method of presentation (computer, slides, print, etc.), type of measurement (response time, accuracy) and year of publication did not significantly contribute to variance in effect sizes. These findings are discussed in the context of social and biological explanations of sex differences in emotion recognition.
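To make the effect-size arithmetic concrete, the sketch below shows how a standardized mean difference such as the reported d = 0.19 is computed and pooled across studies. It is an illustrative simplification, not the study's analysis: the authors used a multilevel meta-analysis, whereas this example pools with a basic DerSimonian-Laird random-effects model, and all study values are invented.

```python
import numpy as np

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference between two groups (Cohen's d)."""
    pooled_sd = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def d_variance(d, n1, n2):
    """Approximate sampling variance of Cohen's d."""
    return (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

def pool_random_effects(ds, vs):
    """DerSimonian-Laird random-effects pooled effect size."""
    ds, vs = np.asarray(ds), np.asarray(vs)
    w = 1.0 / vs                                  # fixed-effect weights
    d_fixed = np.sum(w * ds) / np.sum(w)
    q = np.sum(w * (ds - d_fixed) ** 2)           # heterogeneity statistic Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(ds) - 1)) / c)      # between-study variance
    w_star = 1.0 / (vs + tau2)                    # random-effects weights
    return np.sum(w_star * ds) / np.sum(w_star)

# Three hypothetical studies: (mean, SD, n) for females, then for males.
studies = [(0.82, 0.10, 60, 0.79, 0.11, 60),
           (0.75, 0.12, 40, 0.73, 0.12, 45),
           (0.88, 0.08, 80, 0.85, 0.09, 75)]
ds = [cohens_d(*s) for s in studies]
vs = [d_variance(d, s[2], s[5]) for d, s in zip(ds, studies)]
print(f"pooled d = {pool_random_effects(ds, vs):.2f}")
```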

7.
Increasing evidence suggests that facial emotion recognition is impaired in bipolar disorder (BD). However, patient–control differences are small owing to ceiling effects on the tasks used to assess them. The extant literature is also limited by a relative absence of attention towards identifying patterns of emotion misattribution or understanding whether neutral faces are mislabelled in the same way as ones displaying emotion. We addressed these limitations by comparing facial emotion recognition performance in BD patients and healthy controls on a novel and challenging task. Thirty-four outpatients with BD I and 32 demographically matched healthy controls completed a facial emotion recognition task requiring the labelling of neutral and emotive faces displayed at low emotional intensities. Results indicated that BD patients were significantly less accurate at labelling faces than healthy controls, particularly faces displaying fearful or neutral expressions. There were no between-group differences in response times or patterns of emotion mislabelling, with both groups confusing sad and neutral faces, although BD patients also mislabelled sad faces as angry. Task performance did not significantly correlate with mood symptom severity in the BD group. These findings suggest that facial emotion recognition impairments in BD extend to neutral face recognition. Emotion misattribution occurs in a similar, albeit exaggerated, manner in patients with BD compared to healthy controls. Future behavioural and neuroimaging research should reconsider the use of neutral faces as baseline stimuli in their task designs.

8.
We used the remember-know procedure (Tulving, 1985) to test the behavioural expression of memory following indirect and direct forms of emotional processing at encoding. Participants (N = 32) viewed a series of facial expressions (happy, fearful, angry, and neutral) while performing tasks involving either indirect (gender discrimination) or direct (emotion discrimination) emotion processing. After a delay, participants completed a surprise recognition memory test. Our results revealed that indirect encoding of emotion produced enhanced memory for fearful faces, whereas direct encoding of emotion produced enhanced memory for angry faces. In contrast, happy faces were better remembered than neutral faces after both indirect and direct encoding tasks. These findings suggest that fearful and angry faces benefit from a recollective advantage when they are encoded in a way that is consistent with the predictive nature of their threat. We propose that the broad memory advantage for happy faces may reflect a form of cognitive flexibility that is specific to positive emotions.

9.
This study of alexithymic characteristics in obese adolescents and preadolescents tested the hypothesis that they show impaired recognition and expression of emotion. The sample included 30 obese young participants and a control group of 30 participants of normal weight for their ages. The stimuli, 42 faces representing seven emotional expressions, were shown to participants, who identified the emotion expressed in each face. The Level of Emotional Awareness Scale was adapted for children to evaluate their ability to describe their emotions. The young obese participants had significantly lower scores on this scale than the control participants, but no differences were found in recognition of emotion. The lack of words to describe emotions might suggest a greater prevalence of alexithymic characteristics in the obese participants, but the hypothesis of a general deficit in the processing of emotional experiences was not supported.

10.
Facial emotions are important for human communication. Unfortunately, traditional facial emotion recognition tasks do not inform about how respondents might behave towards others expressing certain emotions. Approach‐avoidance tasks do measure behaviour, but only on one dimension. In this study 81 participants completed a novel Facial Emotion Response Task. Images displaying individuals with emotional expressions were presented in random order. Participants simultaneously indicated how communal (quarrelsome vs. agreeable) and how agentic (dominant vs. submissive) they would be in response to each expression. We found that participants responded differently to happy, angry, fearful, and sad expressions in terms of both dimensions of behaviour. Higher levels of negative affect were associated with less agreeable responses specifically towards happy and sad expressions. The Facial Emotion Response Task might complement existing facial emotion recognition and approach‐avoidance tasks.

11.
Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about the emotion of the computer user. We extracted 16–26 trajectory features during a choice‐reaching task and examined the link between emotion and cursor motions. Positive or negative emotions were induced in participants by music, film clips, or emotional pictures, and participants reported their emotions in questionnaires. Our 10‐fold cross‐validation analysis shows that statistical models fitted to "known" participants (training data) could predict nearly 10%–20% of the variance in the positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under the curve and direction changes help infer the emotions of computer users.
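As an illustration of the pipeline this abstract describes, the following Python sketch derives two of the named cursor features (area under the curve and direction changes) from a trajectory and runs a 10-fold cross-validated regression with scikit-learn. The feature definitions, data, and model are hypothetical stand-ins rather than the study's 16–26 features or its statistical models; with random toy data the cross-validated R² hovers near zero, so only the structure of the analysis is meaningful.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_val_score

def trajectory_features(xs, ys):
    """Two illustrative cursor features: area under the curve (deviation
    from the straight start-to-end path) and horizontal direction changes."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    start = np.array([xs[0], ys[0]])
    path = np.array([xs[-1], ys[-1]]) - start
    pts = np.stack([xs, ys], axis=1) - start
    # Perpendicular distance of each sample from the straight-line path
    deviation = (path[0] * pts[:, 1] - path[1] * pts[:, 0]) / np.linalg.norm(path)
    auc = np.trapz(np.abs(deviation))                    # area under the curve
    flips = np.sum(np.diff(np.sign(np.diff(xs))) != 0)   # direction changes
    return np.array([auc, flips])

# Toy data: 200 simulated choice-reaching trajectories and affect ratings.
rng = np.random.default_rng(0)
X = np.array([trajectory_features(np.cumsum(rng.normal(1, 2, 50)),
                                  np.cumsum(rng.normal(1, 2, 50)))
              for _ in range(200)])
y = rng.normal(0, 1, 200)  # self-reported positive affect (random here)

# 10-fold cross-validation: fit on "known" trials, predict "unknown" ones.
scores = cross_val_score(Ridge(), X, y,
                         cv=KFold(n_splits=10, shuffle=True, random_state=0),
                         scoring="r2")
print(f"mean cross-validated R^2 = {scores.mean():.3f}")
```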

12.
The aim of this study was to examine the moderating role of emotional awareness in the relationship between emotion regulation strategies and emotional information processing. A total of 120 female students regulated emotions while watching an unpleasant film. Before and after emotion induction, participants completed a set of tasks that required matching facial expressions. The results demonstrated that participants who were high in emotional awareness showed a significantly smaller increase in error responses (i.e., incorrect matches) than participants who were low in emotional awareness. However, this effect was observed only in the suppression (i.e., inhibition of emotionally expressive behavior), masking (i.e., covering the experienced emotion with a happy expression) and control (i.e., no regulation) conditions. Among reappraisers, who were instructed to adopt a neutral attitude toward the film, there was no significant increase in error responses, regardless of whether they were high or low in emotional awareness. This study shows that the potentially damaging impact of negative emotions on the processing of emotional information can be prevented by high emotional awareness or by the implementation of reappraisal as an emotion regulation strategy.

13.
Children who are able to recognize others' emotions are successful in a variety of socioemotional domains, yet we know little about how school‐aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents' own emotion‐related beliefs, behaviours, and skills. We examined parents' beliefs about the value of emotion and guidance of children's emotion, parents' emotion labelling and teaching behaviours, and parents' skill in recognizing children's emotions in relation to their school‐aged children's emotion recognition skills. Sixty‐nine parent–child dyads completed questionnaires, participated in dyadic laboratory tasks, and identified their own emotions and emotions felt by the other participant from videotaped segments. Regression analyses indicate that parents' beliefs, behaviours, and skills together account for 37% of the variance in child emotion recognition ability, even after controlling for parent and child expressive clarity. The findings suggest the importance of the family milieu in the development of children's emotion recognition skill in middle childhood and add to accumulating evidence suggesting important age‐related shifts in the relation between parental emotion socialization and child emotional development.
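A figure like the 37% reported here is typically the R² change from a hierarchical regression: fit the control variables first, then add the predictors of interest. Below is a minimal sketch of that procedure with statsmodels; the variable names are invented and the data simulated, so it reproduces only the logic, not the study's result.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: 69 dyads, as in the study, but random values.
rng = np.random.default_rng(1)
n = 69
df = pd.DataFrame({
    "child_recog": rng.normal(0, 1, n),  # child emotion recognition skill
    "clarity":     rng.normal(0, 1, n),  # expressive clarity (control)
    "beliefs":     rng.normal(0, 1, n),  # parents' emotion-related beliefs
    "behaviours":  rng.normal(0, 1, n),  # labelling/teaching behaviours
    "skills":      rng.normal(0, 1, n),  # parents' recognition skill
})

# Step 1: control variable only; Step 2: add the parent predictors.
base = smf.ols("child_recog ~ clarity", data=df).fit()
full = smf.ols("child_recog ~ clarity + beliefs + behaviours + skills",
               data=df).fit()

delta_r2 = full.rsquared - base.rsquared  # incremental variance explained
print(f"R^2 change = {delta_r2:.2f}")     # the study reports ~.37
```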

14.
Healthy normotensive men and women (N=33) underwent a 60-day diary assessment of emotions and cardiovascular functioning. Individual differences in social connectedness and mood were measured in questionnaires, and positive emotions, negative emotions, systolic blood pressure (SBP), and diastolic blood pressure (DBP) were assessed daily for 60 consecutive days. Results confirmed that the cardiovascular undoing effect of positive emotions is evident primarily in the context of negative emotional arousal. The daily associations between positive emotions and cardiovascular outcomes were linked to individual differences in social connectedness. Controlling for individual differences in mood levels, multilevel regression analyses showed that social connectedness predicted extended positive emotion, diminished SBP and DBP reactivity, and more rapid SBP recovery from daily negative emotional states.
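Here is a minimal sketch of the kind of multilevel diary model this abstract describes, using statsmodels' MixedLM: days nested within participants, a person-level moderator, and a cross-level interaction. The variable names and simulated data are illustrative assumptions, not the study's dataset or exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated 60-day diary data for 33 participants (values are random).
rng = np.random.default_rng(2)
n_subj, n_days = 33, 60
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_days),
    "neg_emo": rng.normal(0, 1, n_subj * n_days),            # daily negative emotion
    "connect": np.repeat(rng.normal(0, 1, n_subj), n_days),  # connectedness (person-level)
})
df["sbp"] = 120 + 2 * df["neg_emo"] + rng.normal(0, 5, len(df))

# Random-intercept model: daily SBP predicted by daily negative emotion,
# person-level connectedness, and their cross-level interaction
# (connectedness moderating cardiovascular reactivity).
model = smf.mixedlm("sbp ~ neg_emo * connect", data=df, groups=df["subject"])
print(model.fit().summary())
```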

15.
Behavior Therapy, 2022, 53(2): 182–195
Eye-tracking-based attention research has consistently shown that depressed individuals lack the normative attentional bias away from dysphoric face stimuli that characterizes the attention system of non-depressed individuals. However, this more equal attention allocation pattern could also be related to biased emotion identification, namely an inclination of depressed individuals to attribute negative emotions to non-negative stimuli when processing mood-congruent stimuli. Here, we examined emotion identification as a possible mechanism associated with attention allocation when processing emotional faces in depression. Attention allocation and emotion identification in participants with high (HD; n = 30) and low (LD; n = 30) levels of depression symptoms were assessed using two corresponding tasks previously shown to yield significant findings in depression, with the same face stimuli (sad, happy, and neutral faces) used across both tasks. We examined group differences on each task and possible between-task associations. Results showed that while LD participants dwelled longer on relatively positive faces than on relatively negative faces in the attention allocation task, HD participants showed no such bias, dwelling equally on both. Trait anxiety did not affect these results. No group differences were noted for emotion identification, and no between-task associations emerged. The present results suggest that depression is characterized by the lack of a general attention bias toward relatively positive faces over relatively negative faces, which is not related to a corresponding bias in emotion identification.

16.
Research suggests that infants progress from discrimination to recognition of emotions in faces during the first half year of life. It is unknown whether the perception of emotions from bodies develops in a similar manner. In the current study, when presented with happy and angry body videos and voices, 5-month-olds looked longer at the matching video when the videos were presented upright but not when they were inverted. In contrast, 3.5-month-olds failed to match even with upright videos. Thus, 5-month-olds but not 3.5-month-olds exhibited evidence of recognition of emotions from bodies by demonstrating intermodal matching. In a subsequent experiment, younger infants did discriminate between body emotion videos but failed to exhibit an inversion effect, suggesting that discrimination may be based on low-level stimulus features. These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months. This pattern of development is similar to face emotion knowledge development and suggests that both the face and body emotion perception systems develop rapidly during the first half year of life.

17.
Background: Difficulties with social function have been reported in chronic fatigue syndrome (CFS), but the underpinning factors are unknown. Emotion recognition, theory of mind (inference of another's mental state) and ‘emotional’ theory of mind (eToM; inference of another's emotional state) are important social abilities, facilitating understanding of others. This study examined emotion recognition and eToM in CFS patients and their relationship to self-reported social function.

Methods: CFS patients (n = 45) and healthy controls (HCs; n = 50) completed tasks assessing emotion recognition, basic or advanced eToM (for self and other), and a self-report measure of social function.

Results: CFS participants were poorer than HCs at recognising emotion states in the faces of others and at inferring their own emotions. Lower scores on these tasks were associated with poorer self-reported daily and social function. CFS patients demonstrated good eToM and performance on these tasks did not relate to the level of social function.

Conclusions: CFS patients do not have poor eToM, nor does eToM appear to be associated with social functioning in CFS. However, this group of patients experiences difficulties in emotion recognition and in inferring their own emotions, and this may impact upon social function.

18.
Ecological momentary assessment (EMA) methodology was used to examine the emotional context of nonsuicidal self‐injury (NSSI). Forty‐seven adolescents and young adults used a novel smartphone app to monitor their emotional experiences, NSSI thoughts, and NSSI behaviors for 2 weeks. Momentary changes in both negative and positive emotions predicted greater intensity of NSSI thoughts at the subsequent assessment, while only increases in negative emotion predicted NSSI behaviors. Immediately following NSSI behaviors, participants reported reduced high‐arousal negative emotions and increased low‐arousal positive emotions, suggesting that NSSI may be an efficient and effective method of regulating emotion. Findings highlight the importance of addressing emotion regulation in NSSI interventions.
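To show how momentary change scores and next-assessment outcomes like those analysed here are typically constructed from EMA records, here is a small pandas sketch; the column names and values are hypothetical. The resulting lagged pairs would then feed a multilevel model of the kind sketched under item 14.

```python
import pandas as pd

# Toy EMA records: repeated prompts per participant, in time order.
ema = pd.DataFrame({
    "pid":        [1, 1, 1, 2, 2, 2],   # participant id
    "neg_emo":    [2, 4, 5, 1, 3, 2],   # momentary negative emotion
    "nssi_think": [0, 1, 3, 0, 2, 1],   # NSSI-thought intensity
})

# Within-person change in negative emotion since the previous prompt,
# paired with NSSI-thought intensity at the *next* assessment.
ema["neg_change"] = ema.groupby("pid")["neg_emo"].diff()
ema["next_nssi"] = ema.groupby("pid")["nssi_think"].shift(-1)
print(ema.dropna())
```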

19.
Background/objective: Individuals with broad autism phenotype (BAP) show a diminished ability to recognize emotion. This study aimed to examine whether their decline in emotion recognition ability could be more clearly identified as task complexity increased, and whether their decline could be influenced by their eye-gaze patterns.

Method: 41 individuals with BAP and 40 healthy controls performed two types of emotion recognition tasks. After identifying conditions in which the BAP group did not perform as well as the control group, we compared gaze proportions on faces and context between groups when performing those conditions.

Results: The more difficult the task, the clearer the significant relationship between the level of autistic traits and emotion recognition ability. The BAP group showed lower accuracy than the control group when a face with mild emotional intensity was presented with context. In terms of gaze proportion, the BAP group looked less at faces when recognizing emotions compared to the control group.

Conclusion: These findings indicate that the diminished emotion recognition ability of individuals with BAP may be influenced by face gaze.

20.
Sensitivity to facial and vocal emotion is fundamental to children's social competence. Previous research has focused on children's facial emotion recognition, and few studies have investigated non‐linguistic vocal emotion processing in childhood. We compared facial and vocal emotion recognition and processing biases in 4‐ to 11‐year‐olds and adults. Eighty‐eight 4‐ to 11‐year‐olds and 21 adults participated. Participants viewed/listened to faces and voices (angry, happy, and sad) at three intensity levels (50%, 75%, and 100%). Non‐linguistic tones were used. For each modality, participants completed an emotion identification task. Accuracy and bias for each emotion and modality were compared across 4‐ to 5‐, 6‐ to 9‐ and 10‐ to 11‐year‐olds and adults. The results showed that children's emotion recognition improved with age; preschoolers were less accurate than other groups. Facial emotion recognition reached adult levels by 11 years, whereas vocal emotion recognition continued to develop in late childhood. Response bias decreased with age. For both modalities, sadness recognition was delayed across development relative to anger and happiness. The results demonstrate that developmental trajectories of emotion processing differ as a function of emotion type and stimulus modality. In addition, vocal emotion processing showed a more protracted developmental trajectory, compared to facial emotion processing. The results have important implications for programmes aiming to improve children's socio‐emotional competence.
