Similar Articles
1.
The aim was to establish if the memory bias for sad faces, reported in clinically depressed patients (Gilboa-Schechtman, Erhard-Weiss, & Jeczemien, 2002; Ridout, Astell, Reid, Glen, & O'Carroll, 2003), generalises to sub-clinical depression (dysphoria) and experimentally induced sadness. Study 1: dysphoric (n = 24) and non-dysphoric (n = 20) participants were presented with facial stimuli, asked to identify the emotion portrayed and then given a recognition memory test for these faces. At encoding, dysphoric participants (DP) exhibited impaired identification of sadness and neutral affect relative to the non-dysphoric group (ND). At memory testing, DP exhibited superior memory for sad faces relative to happy and neutral faces. They also exhibited enhanced memory for sad faces and impaired memory for happy faces relative to the ND group. Study 2: non-depressed participants underwent a positive (n = 24) or negative (n = 24) mood induction (MI) and were assessed on the same tests as Study 1. At encoding, negative MI participants showed superior identification of sadness, relative to neutral affect and compared to the positive MI group. At memory testing, the negative MI group exhibited enhanced memory for the sad faces relative to happy or neutral faces and compared to the positive MI group. Conclusion: the mood-congruent memory (MCM) bias for sad faces generalises from clinical depression to these sub-clinical affective states.

2.
Attentional biases for negative interpersonal stimuli in clinical depression (total citations: 14; self-citations: 0; citations by others: 14)
An information-processing paradigm was used to examine attentional biases in clinically depressed participants, participants with generalized anxiety disorder (GAD), and nonpsychiatric control participants for faces expressing sadness, anger, and happiness. Faces were presented for 1000 ms, at which point depressed participants had directed their attention selectively to depression-relevant (i.e., sad) faces. This attentional bias was specific to the emotion of sadness; the depressed participants did not exhibit attentional biases to the angry or happy faces. This bias was also specific to depression; at 1000 ms, participants with GAD were not attending selectively to sad, happy, or anxiety-relevant (i.e., angry) faces. Implications of these findings for both the cognitive and the interpersonal functioning of depressed individuals are discussed and directions for future research are advanced.

3.
Older adults perceive less intense negative emotion in facial expressions compared to younger counterparts. Prior research has also demonstrated that mood alters facial emotion perception. Nevertheless, there is little evidence evaluating the interactive effects of age and mood on emotion perception. This study investigated the effects of sad mood on younger and older adults' perception of emotional and neutral faces. Participants rated the intensity of stimuli while listening to sad music and in silence. Measures of mood were administered. Younger and older participants rated sad faces as displaying stronger sadness when they experienced sad mood. While younger participants showed no influence of sad mood on happiness ratings of happy faces, older adults rated happy faces as conveying less happiness when they experienced sad mood. This study demonstrates how emotion perception can change when a controlled mood induction procedure is applied to alter mood in young and older participants.

4.
Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.

5.
The goal of this research was to examine the effects of facial expressions on the speed of sex recognition. Prior research revealed that sex recognition of female angry faces was slower compared with male angry faces and that female happy faces are recognized faster than male happy faces. We aimed to replicate and extend the previous research by using a different set of facial stimuli, a different methodological approach, and by examining the effects of other, previously unexplored expressions (such as crying) on the speed of sex recognition. In the first experiment, we presented facial stimuli of men and women displaying anger, fear, happiness, sadness, and crying, plus three control conditions expressing no emotion. Results showed that sex recognition of angry females was significantly slower compared with sex recognition in any other condition, while sad, crying, happy, frightened and neutral expressions did not impact the speed of sex recognition. In the second experiment, we presented angry, neutral and crying expressions in blocks, and again only sex recognition of female angry expressions was slower compared with all other expressions. The results are discussed in the context of perceptual features of male and female facial configuration, evolutionary theory, and social learning.

6.
This study investigated how target sex, target age, and expressive ambiguity influence emotion perception. Undergraduate participants (N = 192) watched morphed video clips of eight child and eight adult facial expressions shifting from neutral to either sadness or anger. Participants were asked to stop the video clip when they first saw an emotion appear (perceptual sensitivity) and were asked to identify the emotion that they saw (accuracy). Results indicate that female participants identified sad expressions sooner in female targets than in male targets. Participants were also more accurate identifying angry facial expressions by male children than by female children. Findings are discussed in terms of the effects of ambiguity, gender, and age on the perception of emotional expressions.

7.
While the recognition of emotional expressions has been extensively studied, the behavioural response to these expressions has not. In the interpersonal circumplex, behaviour is defined in terms of communion and agency. In this study, we examined behavioural responses to both facial and postural expressions of emotion. We presented 101 Romanian students with facial and postural stimuli involving individuals ('targets') expressing happiness, sadness, anger, or fear. Using an interpersonal grid, participants simultaneously indicated how communal (i.e., quarrelsome or agreeable) and agentic (i.e., dominant or submissive) they would be towards people displaying these expressions. Participants were agreeable-dominant towards targets showing happy facial expressions and primarily quarrelsome towards targets with angry or fearful facial expressions. Responses to targets showing sad facial expressions were neutral on both dimensions of interpersonal behaviour. Postural versus facial expressions of happiness and anger elicited similar behavioural responses. Participants responded in a quarrelsome-submissive way to fearful postural expressions and in an agreeable way to sad postural expressions. Behavioural responses to the various facial expressions were largely comparable to those previously observed in Dutch students. Observed differences may be explained by participants' cultural backgrounds. Responses to the postural expressions largely matched responses to the facial expressions.

8.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression.
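To illustrate the kind of signal detection analysis referred to above, the sketch below (illustrative only, not the authors' code) derives a recognition sensitivity index d′ from hit and false-alarm counts, with a log-linear correction to avoid infinite z-scores; the counts and the angry-face example are hypothetical.

```python
# Illustrative sketch of a signal detection analysis of recognition accuracy:
# d' = z(hit rate) - z(false-alarm rate), with a log-linear correction so that
# perfect or zero rates do not yield infinite z-scores.
# The counts below are hypothetical and not taken from the study.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# e.g., one participant's recognition of angry faces on the surprise test
print(round(d_prime(hits=18, misses=6, false_alarms=4, correct_rejections=20), 2))
```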

9.
We used the remember-know procedure (Tulving, 1985) to test the behavioural expression of memory following indirect and direct forms of emotional processing at encoding. Participants (N = 32) viewed a series of facial expressions (happy, fearful, angry, and neutral) while performing tasks involving either indirect (gender discrimination) or direct (emotion discrimination) emotion processing. After a delay, participants completed a surprise recognition memory test. Our results revealed that indirect encoding of emotion produced enhanced memory for fearful faces whereas direct encoding of emotion produced enhanced memory for angry faces. In contrast, happy faces were better remembered than neutral faces after both indirect and direct encoding tasks. These findings suggest that fearful and angry faces benefit from a recollective advantage when they are encoded in a way that is consistent with the predictive nature of their threat. We propose that the broad memory advantage for happy faces may reflect a form of cognitive flexibility that is specific to positive emotions.

10.
Visual working memory (WM) for face identities is enhanced when faces express negative versus positive emotion. To determine the stage at which emotion exerts its influence on memory for person information, we isolated expression (angry/happy) to the encoding phase (Experiment 1; neutral test faces) or retrieval phase (Experiment 2; neutral study faces). WM was only enhanced by anger when expression was present at encoding, suggesting that retrieval mechanisms are not influenced by emotional expression. To examine whether emotional information is discarded on completion of encoding or sustained in WM, in Experiment 3 an emotional word categorisation task was inserted into the maintenance interval. Emotional congruence between word and face supported memory for angry but not for happy faces, suggesting that negative emotional information is preferentially sustained during WM maintenance. Our findings demonstrate that negative expressions exert sustained and beneficial effects on WM for faces that extend beyond encoding.

11.
The aim of the present study was to establish if patients with major depression (MD) exhibit a memory bias for sad faces, relative to happy and neutral, when the affective element of the faces is not explicitly processed at encoding. To this end, 16 psychiatric out-patients with MD and 18 healthy, never-depressed controls (HC) were presented with a series of emotional faces and were required to identify the gender of the individuals featured in the photographs. Participants were subsequently given a recognition memory test for these faces. At encoding, patients with MD exhibited a non-significant tendency towards slower gender identification (GI) times, relative to HC, for happy faces. However, the GI times of the two groups did not differ for sad or neutral faces. At memory testing, patients with MD did not exhibit the expected memory bias for sad faces. Similarly, HC did not demonstrate enhanced memory for happy faces. Overall, patients with MD were impaired in their memory for the faces relative to the HC. The current findings are consistent with the proposal that mood-congruent memory biases are contingent upon explicit processing of the emotional element of the to-be-remembered material at encoding.

12.
We investigated the effects of psychopathy on emotional memory among a predominantly female undergraduate sample. Undergraduates (N = 153, mean age = 20.1; 80.1% female; 57.1% Caucasian) completed a facial memory task. Participants were presented with a series of faces (sad, scared, angry, happy, neutral), completed a self-report measure of psychopathy, and were presented with another series of faces (with some from the first phase, and some new). Participants were asked whether they recognized each face from the first set, and reaction time (RT) was measured. Although memory for emotional faces did not differ from neutral faces, there were main effects of emotion, gender and psychopathy on RT. A significant 3-way interaction revealed that males who were higher in psychopathy had slower RTs; they were slow to remember sad, angry and neutral faces. In conclusion, psychopathy may affect emotional memory differently across gender. Specifically, undergraduate men, but not women, with higher psychopathy levels may show impaired memory for emotional faces. Implications for future studies of emotional memory and psychopathy are discussed.

13.
Relatively few studies have examined memory bias for social stimuli in depression or dysphoria. The aim of this study was to investigate the influence of depressive symptoms on memory for facial information. A total of 234 participants completed the Beck Depression Inventory II and a task examining memory for facial identity and expression of happy and sad faces. For both facial identity and expression, the recollective experience was measured with the Remember/Know/Guess procedure (Gardiner & Richardson-Klavehn, 2000). The results show no major association between depressive symptoms and memory for identities. However, dysphoric individuals consciously recalled (Remember responses) more sad facial expressions than non-dysphoric individuals. These findings suggest that sad facial expressions led to more elaborate encoding, and thereby better recollection, in dysphoric individuals.
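As an illustration of the Remember/Know/Guess scoring mentioned above, the sketch below (a hypothetical example, not the study's analysis) tabulates the proportion of recognised old items given a "Remember" response, separately for sad and happy expressions; the trial data and labels are invented.

```python
# Hypothetical illustration of Remember/Know/Guess scoring: the proportion of
# correctly recognised old items receiving a "Remember" (conscious recollection)
# response, split by facial expression. Data are invented, not from the study.
from collections import Counter

old_item_responses = [  # (expression, response) for recognised old items
    ("sad", "remember"), ("sad", "remember"), ("sad", "know"),
    ("happy", "know"), ("happy", "guess"), ("happy", "remember"),
]

counts = Counter(old_item_responses)
for expression in ("sad", "happy"):
    total = sum(counts[(expression, r)] for r in ("remember", "know", "guess"))
    remember_rate = counts[(expression, "remember")] / total if total else float("nan")
    print(f"{expression}: Remember rate = {remember_rate:.2f}")
```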

15.
Children with autism, mental retardation, or language disorders, and children in a clinical control group, were shown photographs of human female, orangutan, and canine (boxer) faces expressing happiness, sadness, anger, surprise and a neutral expression. For each species of faces, children were asked to identify the happy, sad, angry, or surprised expressions. In Experiment 1, error patterns suggested that children with autism were attending to features of the lower face when making judgements about emotional expressions. Experiment 2 supported this impression. When recognizing facial emotion, children without autism performed better when viewing the full face, compared to the upper and lower face alone. Children with autism performed no better when viewing the full face than they did when viewing partial faces, and performed no better than chance when viewing the upper face alone. The results are discussed with respect to differences in the manner that children with and without autism process social information communicated by the face.

16.
To inform how emotions in speech are implicitly processed and registered in memory, we compared how emotional prosody, emotional semantics, and both cues in tandem prime decisions about conjoined emotional faces. Fifty-two participants rendered facial affect decisions (Pell, 2005a), indicating whether a target face represented an emotion (happiness or sadness) or not (a facial grimace), after passively listening to happy, sad, or neutral prime utterances. Emotional information from primes was conveyed by: (1) prosody only; (2) semantic cues only; or (3) combined prosody and semantic cues. Results indicated that prosody, semantics, and combined prosody–semantic cues facilitate emotional decisions about target faces in an emotion-congruent manner. However, the magnitude of priming did not vary across tasks. Our findings highlight that emotional meanings of prosody and semantic cues are systematically registered during speech processing, but with similar effects on associative knowledge about emotions, which is presumably shared by prosody, semantics, and faces.

17.
Detection of angry, happy and sad faces among neutral backgrounds was investigated in three single-emotion tasks and an emotion comparison task using schematic (Experiment 1) and photographic faces (Experiment 2). Both experiments provided evidence for the preferential detection of anger displays over displays of other negative or positive emotions in tasks that employed all three target emotions. Evidence for preferential detection of negative emotion in general was found only with schematic faces. The present results are consistent with the notion that the detection of displays of anger, and to some extent sadness, does not reflect a pre-attentive mechanism, but is the result of a more efficient visual search than is the detection of positive emotion.

18.
It is well known that patients who have sustained frontal-lobe traumatic brain injury (TBI) are severely impaired on tests of emotion recognition. Indeed, these patients have significant difficulty recognizing facial expressions of emotion, and such deficits are often associated with decreased social functioning and poor quality of life. To date, no studies have examined the response patterns that underlie facial emotion recognition impairment in TBI and that may lend clarity to the interpretation of deficits. Therefore, the present study aimed to characterize response patterns in facial emotion recognition in 14 patients with frontal TBI compared to 22 matched control subjects, using a task which required participants to rate the intensity of each emotion (happiness, sadness, anger, disgust, surprise and fear) for a series of photographs of emotional and neutral faces. Results first confirmed the presence of facial emotion recognition impairment in TBI, and further revealed that patients displayed a liberal bias when rating facial expressions, leading them to assign high intensity ratings for incorrect emotional labels to sad, disgusted, surprised and fearful facial expressions. These findings are generally in line with prior studies which also report important facial affect recognition deficits in TBI patients, particularly for negative emotions.

19.
We systematically examined the impact of emotional stimuli on time perception in a temporal reproduction paradigm in which participants reproduced the duration of a facial emotion stimulus using an oval-shaped stimulus, or vice versa. Experiment 1 asked participants to reproduce the duration of an angry face (or the oval) presented for 2,000 ms. Experiment 2 included a range of emotional expressions (happy, sad, angry, and neutral faces as well as the oval stimulus) presented for different durations (500, 1,500, and 2,000 ms). We found that participants over-reproduced the durations of happy and sad faces using the oval stimulus. By contrast, there was a trend towards under-reproduction when the duration of the oval stimulus was reproduced using the angry face. We suggest that increased attention to a facial emotion produces the relativity of time perception.
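As a purely illustrative aid (not the authors' analysis), the sketch below shows one common way to quantify over- and under-reproduction in such a paradigm: the ratio of reproduced to target duration, with values above 1.0 indicating over-reproduction. The example durations are hypothetical.

```python
# Hypothetical illustration of quantifying temporal reproduction:
# ratio > 1.0 means the duration was over-reproduced, < 1.0 under-reproduced.
def reproduction_ratio(reproduced_ms: float, target_ms: float) -> float:
    return reproduced_ms / target_ms

# e.g., a happy face's 2,000 ms duration reproduced as 2,300 ms with the oval
print(reproduction_ratio(2300, 2000))  # 1.15 -> over-reproduction
# e.g., the oval's 2,000 ms duration reproduced as 1,750 ms using the angry face
print(reproduction_ratio(1750, 2000))  # 0.875 -> under-reproduction
```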

20.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expressions of emotion. Participants diagnosed with major depression, participants diagnosed with social phobia, and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed.
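To make the intensity-threshold idea concrete, the sketch below (an assumed illustration, not the authors' procedure) extracts, for one hypothetical participant, the lowest morph intensity at which an expression is first labelled correctly; the trial data and percentages are invented.

```python
# Hypothetical illustration of an identification threshold from a morph
# sequence: the lowest intensity step at which the target emotion is first
# labelled correctly. Trial data are invented, not taken from the study.
def identification_threshold(trials):
    """trials: iterable of (intensity_percent, labelled_correctly) tuples."""
    for intensity, correct in sorted(trials):
        if correct:
            return intensity
    return None  # the emotion was never identified correctly

happy_morph_trials = [(10, False), (20, False), (30, False), (40, True), (50, True)]
print(identification_threshold(happy_morph_trials))  # -> 40
```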
