Similar Articles
20 similar articles found.
1.
In two experiments, ostracized individuals showed more pronounced categorical perception of inclusion- and exclusion-related stimuli. Specifically, ostracism enhanced the ability to distinguish between-category differences (e.g., between happy and angry faces) relative to within-category differences (e.g., between two happy expressions). Participants were socially included or excluded via Cyberball (a virtual ball-tossing task). In Experiment 1, ostracized participants showed greater perceptual acuity in distinguishing between subtly happy and angry expressions, combined with a reduced ability to discriminate expressions within each expression category. Experiment 2 found analogous categorical perception effects for targets varying on the dimension of race. Importantly, this effect was specific to social information; categorical perception of non-social objects was not qualified by social exclusion. These results suggest that ostracism exacerbates categorical perception, attuning perceivers to the differences between various inclusion- and exclusion-related categories relative to within-category acuity, making the world appear more ‘black-and-white’ than it might otherwise.

2.
Three progressively deeper experiments examined how an opponent's happy, neutral, and angry facial expressions influence individuals' cooperative behavior in the prisoner's dilemma game, along with the mediating and moderating roles of related variables. Experiment 1 showed that an opponent's happy expression elicited a higher level of cooperation than an angry expression, and that both happy and neutral expressions produced higher cooperation expectations than angry expressions; cooperation expectation mediated the relationship between facial expression and cooperative behavior. Experiment 2 used instructions to manipulate participants' intuitive versus deliberative decision modes and found that the results of Experiment 1 emerged only under the intuitive condition, not under the deliberative condition; overall, participants cooperated more in the intuitive mode than in the deliberative mode. Experiment 3 used a stricter time-pressure paradigm to manipulate the intuitive versus deliberative decision modes and largely replicated the results of Experiment 2, except that happy expressions also elicited more cooperation than neutral expressions. Based on these results, a moderated mediation model was established to reveal the complex relationships among others' facial expressions, cooperation expectation, cooperative behavior, and individual decision mode.

3.
We examined dysfunctional memory processing of facial expressions in relation to alexithymia. Individuals with high and low alexithymia, as measured by the Toronto Alexithymia Scale (TAS-20), participated in a visual search task (Experiment 1A) and a change-detection task (Experiments 1B and 2) to assess differences in their visual short-term memory (VSTM). In the visual search task, participants were asked to judge whether all facial expressions (angry and happy faces) in the search display were the same or different. In the change-detection task, they had to decide whether all facial expressions changed between two successive displays. We found individual differences only in the change-detection task. Individuals with high alexithymia showed lower sensitivity for happy faces than for angry faces, while individuals with low alexithymia recognized both facial expressions adequately. Experiment 2 examined whether these individual differences arose during the early storage stage or the later retrieval stage of the VSTM process, using a single-probe paradigm. We found no effect of the single probe, indicating that the individual differences occurred at the storage stage. The present results provide new evidence that individuals with high alexithymia show a specific impairment in VSTM processes (especially the storage stage) related to happy but not to angry faces.

4.
A new model of mental representation is applied to social cognition: the attractor field model. Using the model, the authors predicted and found a perceptual advantage but a memory disadvantage for faces displaying evaluatively congruent expressions. In Experiment 1, participants completed a same/different perceptual discrimination task involving morphed pairs of angry-to-happy Black and White faces. Pairs of faces displaying evaluatively incongruent expressions (i.e., happy Black, angry White) were more likely to be labeled as similar and were less likely to be accurately discriminated from one another than faces displaying evaluatively congruent expressions (i.e., angry Black, happy White). Experiment 2 replicated this finding and showed that objective discriminability of stimuli moderated the impact of attractor field effects on perceptual discrimination accuracy. In Experiment 3, participants completed a recognition task for angry and happy Black and White faces. Consistent with the attractor field model, memory accuracy was better for faces displaying evaluatively incongruent expressions. Theoretical and practical implications of these findings are discussed.

5.
The present electromyographic study is a first step toward shedding light on the involvement of affective processes in congruent and incongruent facial reactions to facial expressions. Further, empathy was investigated as a potential mediator underlying the modulation of facial reactions to emotional faces in a competitive, a cooperative, and a neutral setting. Results revealed less congruent reactions to happy expressions, and even incongruent reactions to sad and angry expressions, in the competition condition, whereas virtually no differences emerged between the neutral and the cooperation conditions. Effects on congruent reactions were mediated by cognitive empathy, indicating that the state of empathy plays an important role in the situational modulation of congruent reactions. Further, incongruent reactions to sad and angry faces in the competition setting were mediated by the emotional reaction of joy, supporting the assumption that incongruent facial reactions are mainly based on affective processes. Additionally, strategic processes (specifically, the goal to create and maintain a smooth, harmonious interaction) were found to influence facial reactions in a cooperative mindset. Further studies are now needed to test the generalizability of these effects.

6.
Event-related brain potentials were measured in 7- and 12-month-old infants to examine the development of processing happy and angry facial expressions. In 7-month-olds a larger negativity to happy faces was observed at frontal, central, temporal and parietal sites (Experiment 1), whereas 12-month-olds showed a larger negativity to angry faces at occipital sites (Experiment 2). These data suggest that processing of these facial expressions undergoes development between 7 and 12 months: while 7-month-olds exhibit heightened sensitivity to happy faces, 12-month-olds resemble adults in their heightened sensitivity to angry faces. In Experiment 3 infants' visual preference was assessed behaviorally, revealing that the differences in ERPs observed at 7 and 12 months do not simply reflect differences in visual preference.

7.
The anger-superiority hypothesis states that angry faces are detected more efficiently than friendly faces. Previous research used schematized stimuli, which minimize perceptual confounds but violate ecological validity. The authors argue that a confounding of appearance and meaning is unavoidable, and even unproblematic, if real faces are presented. Four experiments tested carefully controlled photos in a search-asymmetry design. Experiments 1 and 2 revealed more efficient detection of an angry face among happy faces than vice versa. Experiment 3 indicated that the advantage was due to the mouth but not the eyes, and Experiment 4, using upright and inverted thatcherized faces, suggests a perceptual basis. The results are in line with a sensory-bias hypothesis: facial expressions evolved to exploit extant capabilities of the visual system.

8.
Using a visual search paradigm, we investigated how age affects attentional bias to emotional facial expressions. In Experiments 1 and 2, participants searched for a discrepant facial expression in a matrix of otherwise homogeneous faces. Both younger and older adults showed a more effective search when the discrepant face was angry rather than happy or neutral. However, when angry faces served as non-target distractors, younger adults' search was less effective than in the happy or neutral distractor conditions. In contrast, older adults showed a more efficient search with angry distractors than with happy or neutral distractors, indicating that older adults were better able to inhibit angry facial expressions. In Experiment 3, we found that even a top-down search goal could not override the angry face superiority effect in guiding attention. In addition, RT distribution analyses indicated that both younger and older adults performed the top-down angry face search qualitatively differently from the top-down happy face search. The current research indicates that threat face processing involves both an automatic attentional shift and a controlled attentional process, and suggests that age influenced only the controlled attentional process.

9.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently.

10.
Is it easier to detect angry or happy facial expressions in crowds of faces? The present studies used several variations of the visual search task to assess whether people selectively attend to expressive faces. Contrary to widely cited studies (e.g., Öhman, Lundqvist, & Esteves, 2001) that suggest angry faces "pop out" of crowds, our review of the literature found inconsistent evidence for the effect and suggested that low-level visual confounds could not be ruled out as the driving force behind the anger superiority effect. We then conducted 7 experiments, carefully designed to eliminate many of the confounding variables present in past demonstrations. These experiments showed no evidence that angry faces popped out of crowds or even that they were efficiently detected. These experiments instead revealed a search asymmetry favoring happy faces. Moreover, in contrast to most previous studies, the happiness superiority effect was shown to be robust even when obvious perceptual confounds, such as the contrast of white exposed teeth typically displayed in smiling faces, were eliminated in the happy targets. Rather than attribute this effect to the existence of innate happiness detectors, we speculate that the human expression of happiness has evolved to be more visually discriminable because its communicative intent is less ambiguous than that of other facial expressions.

11.
Threatening faces involuntarily grab attention in socially anxious individuals. It is unclear, however, whether this attention capture comes at the expense of concurrent visual processing. The current study examined the perceptual cost of viewing fear-relevant stimuli (threatening faces) relative to a concurrent change-detection task. Steady-state visual evoked potentials (ssVEPs) were used to separate the neural responses to two fully overlapping types of stimuli flickering at different frequencies: task-irrelevant facial expressions (angry, neutral, happy) were overlaid with a task-relevant Gabor patch stream, which required a response to rare phase reversals. Groups of 17 high and 17 low socially anxious observers were recruited through online prescreening of 849 students. A prominent competition effect of threatening faces was observed solely in elevated social anxiety: when an angry face, relative to a neutral or happy face, served as a distractor, heightened ssVEP amplitudes were seen at the tagging frequency of that facial expression. Simultaneously, the ssVEP evoked by the task-relevant Gabor grating was reliably diminished compared with conditions with neutral or happy distractor faces. Thus, threatening faces capture and hold low-level perceptual resources in socially anxious viewers at the cost of a concurrent primary task. Importantly, this competition in lower-tier visual cortex was maintained throughout the viewing period and was unaccompanied by competition effects on behavioral performance.

12.
Sato, W., & Yoshikawa, S. (2007). Cognition, 104(1), 1-18.
Based on previous neuroscientific evidence indicating activation of the mirror neuron system in response to dynamic facial actions, we hypothesized that facial mimicry would occur while subjects viewed dynamic facial expressions. To test this hypothesis, dynamic/static facial expressions of anger/happiness were presented using computer morphing (Experiment 1) and videos (Experiment 2). The subjects' facial actions were unobtrusively videotaped and blindly coded using the Facial Action Coding System [FACS; Ekman, P., & Friesen, W. V. (1978). Facial action coding system. Palo Alto, CA: Consulting Psychologists Press]. In the dynamic presentations common to both experiments, brow lowering, a prototypical action in angry expressions, occurred more frequently in response to angry expressions than to happy expressions. The pulling of lip corners, a prototypical action in happy expressions, occurred more frequently in response to happy expressions than to angry expressions in dynamic presentations. Additionally, the mean latency of these actions was less than 900 ms after the onset of dynamic changes in facial expression. Naive raters recognized the subjects' facial reactions as emotional expressions, with the valence corresponding to the dynamic facial expressions that the subjects were viewing. These results indicate that dynamic facial expressions elicit spontaneous and rapid facial mimicry, which functions both as a form of intra-individual processing and as inter-individual communication.

13.
Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item—a Chinese character—that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala.

14.
Traditionally, anxiety has been associated with a selective attentional bias for threat and a decreased capacity in attentional control. In two different experiments, we investigated whether individuals with different levels of self-reported state anxiety (Experiment 1) and induced anxiety (Experiment 2) had impaired response inhibition processes (attentional control deficit) as characterized by a different response style in the presence of negative stimuli under low and high perceptual load conditions. A go/no-go paradigm with emotional distractors (angry, happy, and neutral faces) was used to provide measures of perceptual sensitivity, inhibition, and response style. Our findings showed that perceptual sensitivity, as assessed by the d' parameter of signal detection theory, was reduced in all participants for angry faces under low perceptual load, where enough perceptual resources were available to be attracted by distractors. Importantly, despite similar perceptual sensitivity, the beta parameter indicated that high state anxiety individuals in both experiments were less flexible at adjusting to task demands in the presence of angry face distractors by adopting a stricter criterion. Implications of findings are discussed within current models of attentional control in anxiety.

15.
Recent studies of the face-in-the-crowd effect, the faster detection of angry than of happy faces in visual search, suggest that for schematic faces it depends on perceptual features such as inward-pointing lines rather than on emotional expressions. Removing a potential confound, Experiments 1–2 replicate the preferential detection of stimuli with inward-pointing lines, but Experiment 2a indicates that a surrounding circle is required for the effect to emerge. Experiments 3–7 failed to find evidence for faster detection of schematic faces comprising only the elements critical for the faster detection of angry faces according to a low-level visual feature account: inward-tilted brows and an upturned mouth. Faster detection of anger was evident if eyes, or eyes and noses, were added, but only if their placement was consistent with the first-order relations among these elements in a human face. Drawing the critical elements in thicker, higher-contrast lines also led to an anger advantage, but this was smaller than that seen for the complete faces. The present results suggest that, while able to support faster target detection, a prevalence of inward-pointing lines is not sufficient to explain the detection advantage of angry schematic faces.

16.
We investigated whether and how emotional facial expressions affect sustained attention in face tracking. In a multiple-identity and object tracking paradigm, participants tracked multiple target faces that continuously moved around together with several distractor faces, and subsequently reported where each target face had moved to. The emotional expression (angry, happy, and neutral) of the target and distractor faces was manipulated. Tracking performance was better when the target faces were angry rather than neutral, whereas angry distractor faces did not affect tracking. The effect persisted when the angry faces were presented upside-down and when surface features of the faces were irrelevant to the ongoing task. There was only suggestive and weak evidence for a facilitatory effect of happy targets and a distraction effect of happy distractors in comparison to neutral faces. The results show that angry expressions on the target faces can facilitate sustained attention on the targets via increased vigilance, yet this effect likely depends on both emotional information and visual features of the angry faces.

17.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression.

18.
Although facial expressions are thought to vary in their functional impact on perceivers, experimental demonstrations of the differential effects of facial expressions on behavior are lacking. In the present study, we examined the effects of exposure to facial expressions on visual search efficiency. Participants (n = 31) searched for a target in a 12-location circular array after exposure to an angry, disgusted, fearful, happy, or neutral facial expression for 100 ms or 500 ms. Consistent with predictions, exposure to a fearful expression prior to visual search resulted in faster target identification than exposure to the other facial expressions. The effects of the other facial expressions on visual search did not differ from each other. The fear-facilitation effect on visual search efficiency was observed at 500-ms but not at 100-ms presentations, suggesting a specific temporal course of the facilitation. Subsequent analyses also revealed that individual differences in fear of negative evaluation, trait anxiety, and obsessive-compulsive symptoms showed differential patterns of association with visual search efficiency. The experimental and clinical implications of these findings are discussed.

19.
This paper reports three studies in which stronger orienting to perceived eye gaze direction was revealed when observers viewed faces showing fearful or angry, compared with happy or neutral, emotional expressions. Gaze-related spatial cueing effects to laterally presented fearful faces and centrally presented angry faces were also modulated by the anxiety level of participants, with high- but not low-state anxious individuals revealing enhanced shifts of attention. In contrast, both high- and low-state anxious individuals demonstrated enhanced orienting to averted gaze when viewing laterally presented angry faces. These results provide novel evidence for the rapid integration of facial expression and gaze direction information, and for the regulation of gaze-cued attention by both the emotion conveyed in the perceived face and the degree of anxiety experienced by the observer.

20.
Findings of 7 studies suggested that decisions about the sex of a face and the emotional expressions of anger or happiness are not independent: Participants were faster and more accurate at detecting angry expressions on male faces and at detecting happy expressions on female faces. These findings were robust across different stimulus sets and judgment tasks and indicated bottom-up perceptual processes rather than just top-down conceptually driven ones. Results from additional studies in which neutrally expressive faces were used suggested that the connections between masculine features and angry expressions and between feminine features and happy expressions might be a property of the sexual dimorphism of the face itself and not merely a result of gender stereotypes biasing the perception.

