Similar Articles
20 similar articles found.
1.
We used the remember-know procedure (Tulving, 1985) to test the behavioural expression of memory following indirect and direct forms of emotional processing at encoding. Participants (N=32) viewed a series of facial expressions (happy, fearful, angry, and neutral) while performing tasks involving either indirect (gender discrimination) or direct (emotion discrimination) emotion processing. After a delay, participants completed a surprise recognition memory test. Our results revealed that indirect encoding of emotion produced enhanced memory for fearful faces whereas direct encoding of emotion produced enhanced memory for angry faces. In contrast, happy faces were better remembered than neutral faces after both indirect and direct encoding tasks. These findings suggest that fearful and angry faces benefit from a recollective advantage when they are encoded in a way that is consistent with the predictive nature of their threat. We propose that the broad memory advantage for happy faces may reflect a form of cognitive flexibility that is specific to positive emotions.

2.
Unconscious facial reactions to emotional facial expressions
Studies reveal that when people are exposed to emotional facial expressions, they spontaneously react with distinct facial electromyographic (EMG) reactions in emotion-relevant facial muscles. These reactions reflect, in part, a tendency to mimic the facial stimuli. We investigated whether corresponding facial reactions can be elicited when people are unconsciously exposed to happy and angry facial expressions. Using the backward-masking technique, we prevented subjects from consciously perceiving 30-ms exposures of happy, neutral, and angry target faces, which were immediately followed and masked by neutral faces. Despite the fact that exposure to happy and angry faces was unconscious, the subjects reacted with distinct facial muscle reactions that corresponded to the happy and angry stimulus faces. Our results show that both positive and negative emotional reactions can be unconsciously evoked, and particularly that important aspects of emotional face-to-face communication can occur on an unconscious level.
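Analytically, such a facial reaction is commonly quantified as the change in rectified EMG amplitude after stimulus onset relative to a pre-stimulus baseline, computed separately for smile-related (zygomaticus major) and frown-related (corrugator supercilii) muscle sites. A minimal sketch of that computation, with all parameter values hypothetical rather than taken from the study:

```python
# Hypothetical sketch: quantify an EMG facial reaction as the
# baseline-corrected mean rectified amplitude in a post-stimulus window.
import numpy as np

def emg_reaction(signal: np.ndarray, fs: int, onset_s: float,
                 window_s: float = 1.0, baseline_s: float = 0.5) -> float:
    """Mean rectified EMG in [onset, onset + window] minus the mean of
    the pre-stimulus baseline; signal is one channel, fs in Hz."""
    onset = int(onset_s * fs)
    baseline = np.abs(signal[onset - int(baseline_s * fs):onset]).mean()
    response = np.abs(signal[onset:onset + int(window_s * fs)]).mean()
    return response - baseline
```

Mimicry would then show up as a larger zygomaticus response to happy faces and a larger corrugator response to angry faces, even when the faces were masked.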

3.
We investigated the source of the visual search advantage of some emotional facial expressions. An emotional face target (happy, surprised, disgusted, fearful, angry, or sad) was presented in an array of neutral faces. Detection was faster for happy targets, with angry and, especially, sad targets being detected more poorly. Physical image properties (e.g., luminance) were ruled out as a potential source of these differences in visual search. In contrast, the search advantage was partly due to facilitated processing of affective content, as shown by an emotion identification task: happy expressions were identified faster than the other expressions and were less likely to be confused with neutral faces, whereas misjudgements occurred more often for angry and sad expressions. Nevertheless, the distinctiveness of some local features (e.g., teeth) that are consistently associated with emotional expressions plays the strongest role in the search advantage pattern. When the contribution of these features to visual search was factored out statistically, the advantage disappeared.

4.
Seven experiments investigated the finding that threatening schematic faces are detected more quickly than nonthreatening faces. Threatening faces with v-shaped eyebrows (angry and scheming expressions) were detected more quickly than nonthreatening faces with inverted v-shaped eyebrows (happy and sad expressions). In contrast to the hypothesis that these effects were due to perceptual features unrelated to the face, no advantage was found for v-shaped eyebrows presented in a nonfacelike object. Furthermore, the addition of internal facial features (the eyes, or the nose and mouth) was necessary to produce the detection advantage for faces with v-shaped eyebrows. Overall, the results are interpreted as showing that the v-shaped eyebrow configuration affords easy detection, but only when other internal facial features are present.

5.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently.

6.
The goal of this study was to explore the ability of violent men to recognise facial affect. In contrast to traditional approaches to this research question, we took the effects of the models' sex and different types of violent behaviour into consideration. Data obtained from 71 violent men revealed that they recognised facial expressions of fear (p = .019) and disgust (p = .013) more accurately when displayed by female than male models. The opposite was found for angry faces (p = .006), while the models' sex did not affect the recognition of sad, happy and surprised facial expressions or neutral faces. Furthermore, sexual coercion perpetrators were more accurate than other violent men in the recognition of female facial disgust (p = .006). These results are discussed in the context of social learning theory, and the hypothesis that female facial expressions of disgust could be subtle cues to their sexual infidelity that motivate sexual coercion in some men.

7.
The ability to quickly perceive threatening facial expressions allows one to detect emotional states and respond appropriately. The anger superiority hypothesis predicts that angry faces capture attention faster than happy faces. Previous studies have used photographic (Hansen & Hansen, 1988) and schematic face images (e.g., Eastwood, Smilek, & Merikle, 2001; Öhman, Lundqvist, & Esteves, 2001) in studying the anger superiority effect, but specific confounds due to the construction of stimuli have led to conflicting findings. In the current study, participants performed a visual search for either angry or happy target faces among crowds of novel, perceptually intermediate morph distractors. A threat-detection advantage was evident: participants showed faster reaction times and greater accuracy in detecting angry than happy faces. Search slopes, however, did not significantly differ. Results suggest a threat-detection advantage mediated by serial rather than preattentive processing.

8.
The rapid detection of facial expressions of anger or threat has obvious adaptive value. In this study, we examined the efficiency of facial processing by means of a visual search task. Participants searched displays of schematic faces and were required to determine whether the faces displayed were all the same or whether one was different. Four main results were found: (1) When displays contained the same faces, people were slower in detecting the absence of a discrepant face when the faces displayed angry (or sad/angry) rather than happy expressions. (2) When displays contained a discrepant face, people were faster in detecting it when the discrepant face displayed an angry rather than a happy expression. (3) Neither of these patterns for same and different displays was apparent when face displays were inverted, or when just the mouth was presented in isolation. (4) The search slopes for angry targets were significantly lower than those for happy targets. These results suggest that detection of angry facial expressions is fast and efficient, although it does not "pop out" in the traditional sense.
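A search slope is the regression slope of reaction time on display size: a near-flat slope indicates efficient ("pop-out") search, while a steeper slope suggests item-by-item serial processing. A minimal illustrative computation with made-up reaction times, not data from the study above:

```python
# Illustrative visual-search slope computation; the RT values below are
# hypothetical, not taken from the study summarized above.
import numpy as np

display_sizes = np.array([4, 9, 16])           # items per display
rt_angry = np.array([620.0, 655.0, 690.0])     # hypothetical mean RTs (ms)
rt_happy = np.array([640.0, 710.0, 780.0])

# np.polyfit with degree 1 returns (slope, intercept).
slope_angry, _ = np.polyfit(display_sizes, rt_angry, 1)
slope_happy, _ = np.polyfit(display_sizes, rt_happy, 1)
print(f"angry: {slope_angry:.1f} ms/item; happy: {slope_happy:.1f} ms/item")
```

A lower ms/item cost for angry targets is what "significantly lower search slopes for angry targets" means in the abstract above.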

9.
In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features, especially the smiling mouth, is responsible for facilitated initial orienting, which thus shortens detection.
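The abstract does not specify the saliency model used. As an illustration of how "computationally modeled visual saliency" can be derived from an image, here is a minimal sketch of the spectral-residual method (Hou & Zhang, 2007), which may well differ from the model the authors actually applied:

```python
# Minimal spectral-residual saliency (Hou & Zhang, 2007); an
# illustration only, not necessarily the model used in the study above.
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def spectral_residual_saliency(img: np.ndarray) -> np.ndarray:
    """Return a saliency map for a 2-D grayscale image (float array)."""
    spectrum = np.fft.fft2(img)
    log_amplitude = np.log(np.abs(spectrum) + 1e-8)
    phase = np.angle(spectrum)
    # The "residual": log amplitude minus its local average.
    residual = log_amplitude - uniform_filter(log_amplitude, size=3)
    # Reconstruct from the residual spectrum and smooth the result.
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(saliency, sigma=3)
```

On a face image, such a map typically peaks on high-contrast local features; a peak on a toothy smile would be consistent with the mouth-region advantage reported above.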

10.
Two experiments competitively test 3 potential mechanisms (negativity inhibiting responses, feature-based accounts, and evaluative context) for the response latency advantage for recognizing happy expressions by investigating how the race of a target can moderate the strength of the effect. Both experiments indicate that target race modulates the happy face advantage, such that European American participants displayed the happy face advantage for White target faces, but displayed a response latency advantage for angry (Experiments 1 and 2) and sad (Experiment 2) Black target faces. This pattern of findings is consistent with an evaluative context mechanism and inconsistent with negativity inhibition and feature-based accounts of the happy face advantage. Thus, the race of a target face provides an evaluative context in which facial expressions are categorized.

11.
The goal of this research was to examine the effects of facial expressions on the speed of sex recognition. Prior research revealed that sex recognition of female angry faces was slower than that of male angry faces, and that female happy faces are recognized faster than male happy faces. We aimed to replicate and extend the previous research by using a different set of facial stimuli and a different methodological approach, and by examining the effects of some previously unexplored expressions (such as crying) on the speed of sex recognition. In the first experiment, we presented facial stimuli of men and women displaying anger, fear, happiness, sadness, and crying, plus three control conditions expressing no emotion. Results showed that sex recognition of angry females was significantly slower than sex recognition in any other condition, while sad, crying, happy, frightened and neutral expressions did not affect the speed of sex recognition. In the second experiment, we presented angry, neutral and crying expressions in blocks, and again only sex recognition of female angry expressions was slower than that of all other expressions. The results are discussed in the context of the perceptual features of male and female facial configurations, evolutionary theory and social learning theory.

12.
This study examined whether 4‐month‐olds (N = 40) could perceptually categorize happy and angry faces, and show appropriate behavior in response to these faces. During the habituation phase, infants were shown the same type of facial expressions (happy or angry) posed by three models, and their behavior in response to those faces was observed. During the test phase immediately after the habituation phase, infants saw a novel emotional expression and a familiar expression posed by a new model, and their looking times were measured. The results indicated that, although 4‐month‐olds could perceptually categorize happy and angry faces accurately, they responded positively to both expression types. These findings suggest that, although infants can perceptually categorize facial expressions at 4 months of age, they require further time to learn the affective meanings of the facial expressions.

13.
Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study evaluated mechanisms underlying RFRs during childhood and examined possible impairment in children with ASD. Experiment 1 found RFRs to happy and angry faces (not fear faces) in 15 typically developing children from 7 to 12 years of age. RFRs of fear (not anger) in response to angry faces indicated an emotional mechanism. In 11 children (8-13 years of age) with ASD, Experiment 2 found undifferentiated RFRs to fear expressions and no consistent RFRs to happy or angry faces. However, as children with ASD aged, matching RFRs to happy faces increased significantly, suggesting the development of processes underlying matching RFRs during this period in ASD.

14.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Signal detection analyses showed that the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression.
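In a recognition task, a signal detection analysis separates sensitivity (d') from response bias (criterion c) using hit and false-alarm rates. A minimal sketch of that computation, with hypothetical counts rather than data from the study above:

```python
# Signal detection sketch: d' and criterion c from a recognition test.
# The counts in the example call are hypothetical.
from statistics import NormalDist

def sdt(hits: int, misses: int, fas: int, crs: int) -> tuple[float, float]:
    """Return (d-prime, criterion), using a +0.5 correction so that
    rates of 0 or 1 do not produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (fas + 0.5) / (fas + crs + 1.0)
    z = NormalDist().inv_cdf            # inverse standard-normal CDF
    dprime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return dprime, criterion

# e.g., 18 hits / 2 misses on studied angry faces, 4 false alarms /
# 16 correct rejections on lures:
print(sdt(hits=18, misses=2, fas=4, crs=16))
```

The "better accuracy ... for angry faces" in the dysphoric group would correspond to a larger d' for angry targets.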

15.
Event-related potentials (ERPs), accuracy scores, and reaction times were used to examine the recognition of emotional expressions. Adults and 7-year-old children saw upright and inverted chromatic slides of the facial expressions of happiness, fear, surprise, and anger, and were asked to press a button for either "happy" or "angry" faces. A positive-going waveform (P300) was apparent at parietal scalp (Pz) and at left and right temporal scalp. Although the behavioral data were similar for both children and adults (e.g., both had more difficulty recognizing angry expressions than happy ones, and angry expressions were more difficult to recognize upside-down than were happy faces), the ERPs indicated that children responded differently than adults did to happy and angry expressions. Adults showed greater P300 amplitude to happy faces, while children showed greater P300 amplitude to angry faces. In addition, for adults, but not children, there were greater P300 amplitude responses at right vs. left temporal scalp.

16.
Using the item-method directed forgetting paradigm (i.e. intentionally forgetting specified information), we examined directed forgetting of facial identity as a function of facial expression and the sex of the expresser and perceiver. Participants were presented with happy and angry male and female faces cued for either forgetting or remembering, and were then asked to recognise previously studied faces from among a series of neutral faces. For each recognised test face, participants also recalled the face’s previously displayed emotional expression. We found that angry faces were more resistant to forgetting than were happy faces. Furthermore, angry expressions on male faces and happy expressions on female faces were recognised and recalled better than vice versa. Signal detection analyses revealed that male faces gave rise to a greater sensitivity than female faces did, and male participants, but not female participants, showed greater sensitivity to male faces than to female faces. Several theoretical implications are discussed.

17.
We examined dysfunctional memory processing of facial expressions in relation to alexithymia. Individuals with high and low alexithymia, as measured by the Toronto Alexithymia Scale (TAS-20), participated in a visual search task (Experiment 1A) and a change-detection task (Experiments 1B and 2) to assess differences in their visual short-term memory (VSTM). In the visual search task, the participants were asked to judge whether all facial expressions (angry and happy faces) in the search display were the same or different. In the change-detection task, they had to decide whether the facial expressions changed between two successive displays. We found individual differences only in the change-detection task. Individuals with high alexithymia showed lower sensitivity for happy faces than for angry faces, while individuals with low alexithymia recognized both facial expressions well. Experiment 2 examined whether the individual differences arose during the early storage stage or the later retrieval stage of the VSTM process, using a single-probe paradigm. We found no effect of the single probe, indicating that the individual differences occurred at the storage stage. The present results provide new evidence that individuals with high alexithymia show a specific impairment in VSTM processes (especially the storage stage) related to happy but not to angry faces.

18.
Detection of angry and happy faces is generally found to be easier and faster than detection of faces expressing emotions other than anger or happiness. This can be explained by the threatening account and the feature account. Few empirical studies have explored the interaction between these two accounts, which are seemingly, but not necessarily, mutually exclusive. The present studies hypothesised that prominent facial features are important in facilitating the detection of both angry and happy expressions, and that the detection of happy faces would be more facilitated by prominent features than that of angry faces. The results confirmed these hypotheses: participants reacted faster to emotional expressions with prominent features (Study 1), and the detection of happy faces was more facilitated by a prominent feature than that of angry faces (Study 2). The findings are compatible with the evolutionary speculation that the angry expression is an alarming signal of potential threats to survival. Compared to angry faces, happy faces need more salient physical features to reach a similar level of processing efficiency.

19.
Experiments using schematic faces developed by Öhman (Öhman, Lundqvist, & Esteves, 2001) seem to document an anger-superiority effect, although we have come to question these experiments. Our work shows that the low-level features of these schematic faces interact with the face’s surround to produce effects that have been attributed to facial affect. Using relatively neutral faces that preserved the feature and surround spatial relationships of angry and happy schematic faces, we produced reaction times (RTs) that were indistinguishable from those found with angry and happy faces. We also found that the target face’s position within the crowd determined the magnitude of the advantage for angry faces as well as for relatively affect-neutral faces. Removing the facial surround reduces the advantage for angry faces, largely by improving performance on happy faces. There was an apparent small advantage for angry features without a surround. Öhman faces avoid the problems associated with modified grayscale faces only to introduce an equally troubling confound.

20.
The present research demonstrates that the attention bias to angry faces is modulated by how people categorize these faces. Since facial expressions contain psychologically meaningful information for social categorizations (e.g., gender, personality) but not for non-social categorizations (e.g., eye-color), angry facial expressions should especially capture attention during social categorization tasks. Indeed, in three studies, participants were slower to name the gender of angry compared to happy or neutral faces, but not their color (blue or green; Study 1) or eye-color (blue or brown; Study 2). Furthermore, when different eye-colors were linked to a personality trait (introversion, extraversion) versus sensitivity to light frequencies (high, low), angry faces only slowed down categorizations when eye-color was indicative of a social characteristic (Study 3). Thus, vigilance for angry facial expressions is contingent on people's categorization goals, supporting the perspective that even basic attentional processes are moderated by social influences.
