Similar Articles
20 similar articles found (search time: 15 ms)
1.
In two studies, we used a negative affective priming task with pictures of angry (Study 1), sad (Study 2), and happy faces (Studies 1 and 2) to measure attentional inhibition of emotional stimuli as a function of attachment style. Results showed that attachment avoidance was associated with a stronger inhibition of both angry and sad faces. This indicates that the regulatory strategies of avoidant individuals involve inhibition of different types of negative, but not positive, stimuli. Attachment anxiety, on the other hand, showed no association with inhibitory responding to negative stimuli, although we did find indications of impaired inhibitory processing of happy faces in Study 1. The results are discussed in relation to current evidence on avoidant affect-regulation strategies.

2.
Deficient Inhibition of Return for Emotional Faces in Depressed Individuals
戴琴, 冯正直. 《心理学报》, 2009, 41(12): 1175-1188
This study examined the effect of depression on inhibition of return (IOR) for emotional faces. Using the Beck Depression Inventory, the Self-Rating Depression Scale, the CCMD-3, and the Hamilton Depression Scale, 17 participants each were screened into a normal control group, a remitted-depression group, and a depressed-patient group, all of whom completed behavioral and event-related potential (ERP) experiments using a cue-target task with photographs of emotional faces. In the cue-target paradigm, the target appeared after the cue disappeared, and participants responded to the target's location. Behaviorally, at a stimulus onset asynchrony (SOA) of 14 ms, the control group showed IOR for neutral faces, the remitted group showed IOR for all faces, and the patient group showed IOR for angry, sad, and neutral faces. At an SOA of 250 ms, all three groups showed deficient IOR for sad faces, most markedly the patient group, and the remitted group also showed deficient IOR for happy faces. At an SOA of 750 ms, the control group showed IOR for sad faces; the remitted group showed deficient IOR for happy and sad faces; and the patient group showed deficient IOR for sad faces but intact IOR for angry faces. In the ERP data at the 750 ms SOA, the control group showed larger P3 amplitudes to happy-face cues than the other groups, smaller P1 amplitudes to invalidly cued happy faces than to other faces, smaller P1 amplitudes to validly cued sad faces than to happy faces, larger P3 amplitudes to validly cued happy faces than the patient group, and larger P3 amplitudes to invalidly cued sad faces than the other groups. The remitted group showed larger P3 amplitudes to sad-face cues than to other faces, larger P3 amplitudes to validly cued happy faces than the patient group, and smaller P3 amplitudes to invalidly cued sad faces than the control group. The patient group showed larger P1 amplitudes to sad-face cues than the other groups and larger P3 amplitudes to sad-face cues than to other faces, smaller P3 amplitudes to invalidly cued sad faces than the control group, and smaller P3 amplitudes to validly cued happy faces than the other groups. These results suggest that depressed patients have deficient inhibition of return for negative stimuli. This deficit makes it hard for depressed individuals to resist interference from negative events, leaving them vulnerable to negative emotional states; they may therefore experience more depressed mood, sustaining and deepening the depression. Remitted individuals, by contrast, showed deficient IOR for both happy and sad faces, which allows them to register positive and negative stimuli alike and thereby maintain a certain cognitive and emotional balance.

3.
Using a revised Levels of Emotional Awareness Scale (LEAS), 315 preservice teachers were surveyed, and 60 participants each with high and low emotional awareness were selected to complete an emotional-face Stroop task (Study 1) and an emotional-word Stroop task (Study 2). (1) In Study 1, accuracy was highest for neutral faces, intermediate for happy faces, and lowest for sad faces; reaction times were longest for sad faces, intermediate for happy faces, and shortest for neutral faces; the high-awareness group was slower than the low-awareness group only for happy and neutral faces; and the interference effect of negative faces exceeded that of positive faces. (2) In Study 2, reaction times were longest for negative words, significantly longer than for neutral and positive words; the high-awareness group was slower than the low-awareness group only for positive and neutral words, and showed a larger interference effect for positive words than the low-awareness group. The results indicate that, relative to neutral stimuli, both the high- and low-awareness groups showed an attentional bias toward emotional stimuli, especially negative ones, and that, compared with the low-awareness group, preservice teachers with high emotional awareness showed an attentional bias not only toward negative but also toward positive emotional information.

4.
Affective words and faces each seem to be evaluated automatically, but it is unclear if they differ from one another in perceptual salience as measured by automaticity. The current study examined a possible hierarchy among affective stimuli using a modified photo–word Stroop task. Positive and negative words were superimposed across faces expressing positive (happy) and negative (angry, sad) emotions. Participants categorised the valence of faces and words. Across two experiments, interference effects were seen for both stimulus types. However, larger interference effects occurred during judgements of words than expressions, suggesting affective faces are processed more automatically than affective words. When the strength of positive and negative expressions was compared, angry expressions resulted in a larger interference effect than sad ones, and happy expressions produced interference similar to that of angry faces. The latter result contrasts with research suggesting potential threat stimuli are processed more automatically than positive stimuli. Implications are discussed.

5.
Infants’ ability to discriminate emotional facial expressions and tones of voice is well-established, yet little is known about infant discrimination of emotional body movements. Here, we asked if 10–20-month-old infants rely on high-level emotional cues or low-level motion related cues when discriminating between emotional point-light displays (PLDs). In Study 1, infants viewed 18 pairs of angry, happy, sad, or neutral PLDs. Infants looked more at angry vs. neutral, happy vs. neutral, and neutral vs. sad. Motion analyses revealed that infants preferred the PLD with more total body movement in each pairing. Study 2, in which infants viewed inverted versions of the same pairings, yielded similar findings except for sad-neutral. Study 3 directly paired all three emotional stimuli in both orientations. The angry and happy stimuli did not significantly differ in terms of total motion, but both had more motion than the sad stimuli. Infants looked more at angry vs. sad, more at happy vs. sad, and about equally to angry vs. happy in both orientations. Again, therefore, infants preferred PLDs with more total body movement. Overall, the results indicate that a low-level motion preference may drive infants’ discrimination of emotional human walking motions.

6.
We examined proactive and reactive control effects in the context of task-relevant happy, sad, and angry facial expressions on a face-word Stroop task. Participants identified the emotion expressed by a face that contained a congruent or incongruent emotional word (happy/sad/angry). Proactive control effects were measured in terms of the reduction in Stroop interference (the difference between incongruent and congruent trials) as a function of previous-trial emotion and previous-trial congruence. Reactive control effects were measured in terms of the reduction in Stroop interference as a function of current-trial emotion and previous-trial congruence. Negative emotions on the previous trial exerted greater influence on proactive control than the positive emotion: sad faces on the previous trial produced a greater reduction in Stroop interference for happy faces on the current trial. However, current-trial angry faces showed stronger adaptation effects than happy faces. Thus, both proactive and reactive control mechanisms depend on the emotional valence of task-relevant stimuli.

7.
ABSTRACT

Background and objectives: Although research supports the premise that depressed and socially anxious individuals direct attention preferentially toward negative emotional cues, little is known about how attention to positive emotional cues might modulate this negative attention bias risk process. The purpose of this study was to determine if associations between attention biases to sad and angry faces and depression and social anxiety symptoms, respectively, would be strongest in individuals who also show biased attention away from happy faces.

Methods: Young adults (N = 151; 79% female; M = 19.63 years) completed self-report measures of depression and social anxiety symptoms and a dot probe task to assess attention biases to happy, sad, and angry facial expressions.

Results: Attention bias to happy faces moderated associations between attention to negatively valenced faces and psychopathology symptoms. However, attention bias toward sad faces was positively and significantly related to depression symptoms only for individuals who also selectively attended toward happy faces. Similarly, attention bias toward angry faces was positively and significantly associated with social anxiety symptoms only for individuals who also selectively attended toward happy faces.

Conclusions: These findings suggest that individuals with high levels of depression or social anxiety symptoms attend preferentially to emotional stimuli across valences.

8.
The goal of this research was to examine the effects of facial expressions on the speed of sex recognition. Prior research revealed that sex recognition of female angry faces was slower than that of male angry faces, and that female happy faces are recognized faster than male happy faces. We aimed to replicate and extend the previous research by using a different set of facial stimuli and a different methodological approach, and by examining the effects of previously unexplored expressions (such as crying) on the speed of sex recognition. In the first experiment, we presented facial stimuli of men and women displaying anger, fear, happiness, sadness, and crying, plus three control conditions expressing no emotion. Results showed that sex recognition of angry females was significantly slower than sex recognition in any other condition, while sad, crying, happy, frightened, and neutral expressions did not affect the speed of sex recognition. In the second experiment, we presented angry, neutral, and crying expressions in blocks, and again only sex recognition of female angry expressions was slower than all other expressions. The results are discussed in the context of perceptual features of male and female facial configuration, evolutionary theory, and social learning.

9.
We systematically examined the impact of emotional stimuli on time perception in a temporal reproduction paradigm in which participants reproduced the duration of a facial emotion stimulus using an oval-shaped stimulus, or vice versa. Experiment 1 asked participants to reproduce the duration of an angry face (or the oval) presented for 2,000 ms. Experiment 2 included a range of emotional expressions (happy, sad, angry, and neutral faces, as well as the oval stimulus) presented for different durations (500, 1,500, and 2,000 ms). We found that participants over-reproduced the durations of happy and sad faces using the oval stimulus. By contrast, there was a trend toward under-reproduction when the duration of the oval stimulus was reproduced using the angry face. We suggest that increased attention to a facial emotion produces the relativity of time perception.

10.
The aim was to establish whether the memory bias for sad faces reported in clinically depressed patients (Gilboa-Schechtman, Erhard-Weiss, & Jeczemien, 2002; Ridout, Astell, Reid, Glen, & O'Carroll, 2003) generalises to sub-clinical depression (dysphoria) and experimentally induced sadness. Study 1: dysphoric (n = 24) and non-dysphoric (n = 20) participants were presented with facial stimuli, asked to identify the emotion portrayed, and then given a recognition memory test for these faces. At encoding, dysphoric participants (DP) exhibited impaired identification of sadness and neutral affect relative to the non-dysphoric group (ND). At memory testing, DP exhibited superior memory for sad faces relative to happy and neutral faces. They also exhibited enhanced memory for sad faces and impaired memory for happy faces relative to the ND group. Study 2: non-depressed participants underwent a positive (n = 24) or negative (n = 24) mood induction (MI) and were assessed on the same tests as in Study 1. At encoding, negative-MI participants showed superior identification of sadness relative to neutral affect and compared with the positive-MI group. At memory testing, the negative-MI group exhibited enhanced memory for sad faces relative to happy or neutral faces and compared with the positive-MI group. Conclusion: the mood-congruent memory (MCM) bias for sad faces generalises from clinical depression to these sub-clinical affective states.

11.
The present work represents the first study to investigate the relationship between adult attachment avoidance and anxiety and automatic affective responses to basic facial emotions. Subliminal affective priming methods allowed for the assessment of unconscious affective reactions. An affective priming task using masked sad and happy faces, both of which are approach-related facial expressions, was administered to 30 healthy volunteers. Participants also completed the Relationship Scales Questionnaire and measures of anxiety and depression. Attachment avoidance was negatively associated with affective priming due to sad (but not happy) facial expressions. This association occurred independently of attachment anxiety, depressivity, and trait anxiety. Attachment anxiety was not correlated with priming due to sad or happy facial expressions. The present results are consistent with the assumption that attachment avoidance moderates automatic affective reaction to sad faces. Our data indicate that avoidant attachment is related to a low automatic affective responsivity to sad facial expressions.

12.
Whereas attentional interference by negative information has previously been assumed to be automatic, the present research hypothesised that this effect depends on the availability of working-memory resources. In two experiments, participants judged the gender of angry versus happy faces. Working-memory load was manipulated by the presence or absence of a math task (Study 1) or by mental rehearsal of a one- versus eight-digit number (Study 2). The results showed that angry faces interfered more with gender naming than happy faces, but only when working-memory load was low. As such, attentional interference by negative stimulus features can be modulated by top-down attentional control processes.

13.
Infant expressions are important signals for eliciting caregiving behaviors in parents. The present study tested whether infant expressions affect adults' behavioral responses, taking into account the role of a mood induction and childhood caregiving experiences. A modified version of the Approach Avoidance Task (AAT) was employed to study nulliparous female university students' implicit responses to infant faces with different expressions. Study 1 showed that sad, neutral, and sleepy expressions elicit a tendency toward avoidance, while no tendency toward approach or avoidance was found for happy faces. Notably, the difference between approach and avoidance response latencies for sad faces was positively correlated with participants' negative caregiving experiences (r = 0.30, p = 0.04, Bonferroni corrected), indicating that individuals who experienced insensitive parental care show more bias toward sad infant faces. In Study 2, we manipulated participants' current mood before the AAT, inducing a sad or happy mood by asking them to recall a sad or happy event from their recent life. Results showed that a sad mood enhanced the bias toward sad faces, while the positive mood induction buffered it. In conclusion, these findings indicate that implicit approach-avoidance behaviors in females depend on the emotional expression of infant faces and are associated with childhood caregiving experiences and current mood.

14.
Human faces are among the most important visual stimuli that we encounter at all ages. This importance partly stems from the face as a conveyer of information on the emotional state of other individuals. Previous research has demonstrated specific scanning patterns in response to threat-related compared to non-threat-related emotional expressions. This study investigated how visual scanning patterns toward faces which display different emotional expressions develop during infancy. The visual scanning patterns of 4-month-old and 7-month-old infants and adults when looking at threat-related (i.e., angry and fearful) versus non-threat-related (i.e., happy, sad, and neutral) emotional faces were examined. We found that infants as well as adults displayed an avoidant looking pattern in response to threat-related emotional expressions with reduced dwell times and relatively fewer fixations to the inner features of the face. In addition, adults showed a pattern of eye contact avoidance when looking at threat-related emotional expressions that was not yet present in infants. Thus, whereas a general avoidant reaction to threat-related facial expressions appears to be present from very early in life, the avoidance of eye contact might be a learned response toward others' anger and fear that emerges later during development.

15.
Threatening facial expressions can signal the approach of someone or something potentially dangerous. Past research has established that adults have an attentional bias for angry faces, visually detecting their presence more quickly than happy or neutral faces. Two new findings are reported here. First, evidence is presented that young children share this attentional bias. In five experiments, young children and adults were asked to find a picture of a target face among an array of eight distracter faces. Both age groups detected threat-relevant faces – angry and frightened – more rapidly than non-threat-relevant faces (happy and sad). Second, evidence is presented that both adults and children have an attentional bias for negative stimuli overall. All negative faces were detected more quickly than positive ones in both age groups. As the first evidence that young children exhibit the same superior detection of threatening facial expressions as adults, this research provides important support for the existence of an evolved attentional bias for threatening stimuli.

16.
Although biased attention to emotional stimuli is considered a vulnerability factor for anxiety and dysphoria, research has infrequently related such attentional biases to dimensional models of vulnerability for anxiety and mood disorders. In two studies (Study 1, n = 64; Study 2, n = 168), we evaluate the differential associations of general negative affectivity, anxiety, and dysphoria with biases in selective attention among nonclinical participants selected to vary in both anxiety and dysphoria. Across both studies, preferential processing of angry faces at a 300-ms exposure duration was associated with a general tendency to experience a range of negative affect, rather than being specific to symptoms of either anxiety or dysphoria. In the second study, we found evidence of a suppressor relationship between anxiety and dysphoria in the prediction of delayed attentional biases (1,000 ms) for sad faces. In particular, dysphoria was specifically associated with biased attention toward sad cues, but only after statistically accounting for anxiety; by contrast, anxiety was specifically associated with attentional avoidance of sad cues, but only after statistically accounting for dysphoria. These results suggest that the specificity of relationships between components of negative affectivity and attention to emotional stimuli varies as a function of the time course at which attentional biases are assessed, highlighting the importance of evaluating both anxiety and dysphoria in research on attentional processing of emotional stimuli.

17.
Previous binocular rivalry studies with younger adults have shown that emotional stimuli dominate perception over neutral stimuli. Here we investigated the effects of age on patterns of emotional dominance during binocular rivalry. Participants performed a face/house rivalry task where the emotion of the face (happy, angry, neutral) and orientation (upright, inverted) of the face and house stimuli were varied systematically. Age differences were found with younger adults showing a general emotionality effect (happy and angry faces were more dominant than neutral faces) and older adults showing inhibition of anger (neutral faces were more dominant than angry faces) and positivity effects (happy faces were more dominant than both angry and neutral faces). Age differences in dominance patterns were reflected by slower rivalry rates for both happy and angry compared to neutral face/house pairs in younger adults, and slower rivalry rates for happy compared to both angry and neutral face/house pairs in older adults. Importantly, these patterns of emotional dominance and slower rivalry rates for emotional-face/house pairs disappeared when the stimuli were inverted. This suggests that emotional valence, and not low-level image features, were responsible for the emotional bias in both age groups. Given that binocular rivalry has a limited role for voluntary control, the findings imply that anger suppression and positivity effects in older adults may extend to more automatic tasks.

18.
The ability to rapidly detect facial expressions of anger and threat over other salient expressions has adaptive value across the lifespan. Although studies have demonstrated this threat superiority effect in adults, surprisingly little research has examined the development of this process over the childhood period. In this study, we examined the efficiency of children's facial processing in visual search tasks. In Experiment 1, children (N=49) aged 8 to 11 years were faster and more accurate in detecting angry target faces embedded in neutral backgrounds than vice versa, and they were slower in detecting the absence of a discrepant face among angry than among neutral faces. This search pattern was unaffected by an increase in matrix size. Faster detection of angry than neutral deviants may reflect that angry faces stand out more among neutral faces than vice versa, or that detection of neutral faces is slowed by the presence of surrounding angry distracters. When keeping the background constant in Experiment 2, children (N=35) aged 8 to 11 years were faster and more accurate in detecting angry than sad or happy target faces among neutral background faces. Moreover, children with higher levels of anxiety were quicker to find both angry and sad faces whereas low anxious children showed an advantage for angry faces only. Results suggest a threat superiority effect in processing facial expressions in young children as in adults and that increased sensitivity for negative faces may be characteristic of children with anxiety problems.

19.
罗新玉, 陈睿, 高鑫, 邹吉林, 周仁来. 《心理科学》, 2012, 35(6): 1289-1293
Using an antisaccade paradigm, this study screened 11 dysphoric and 12 normal control participants with the BDI and SDS scales and used happy, neutral, and sad face pictures as materials to examine oculomotor inhibition of emotional stimuli in dysphoric college students. Results showed that correct saccade latencies were longer in the dysphoric group than in the control group, and that in the antisaccade task the dysphoric group made more errors than controls on emotional face pictures, especially sad faces. The findings indicate that dysphoric individuals process information more slowly than normal individuals and show an inhibition deficit for emotional faces, particularly sad ones.

20.
Two studies were conducted to demonstrate that sad and happy moods can cause individuals to be similarly sensitive to the valence of observed stimuli with regard to how effortfully such stimuli are processed. In Study 1, individuals in whom a sad or happy mood had been induced unitized a behavior sequence less finely when its contents were neutral as opposed to positive. Individuals in a neutral mood state maintained a comparable level of unitization regardless of the valence of the behavior sequence. In Study 2, individuals in whom a sad or a happy mood had been induced processed the arguments in a persuasive communication more extensively when its contents were affectively uplifting rather than depressing. Sad individuals showed this pattern only if no prior affective expectation was provided. Taken together, these studies may fit with the notion that under certain conditions sad and happy individuals similarly decrease the amount of information processed from a neutral (Study 1) or depressing (Study 2), relative to a positive, stimulus.
