Similar Articles
1.
The aim of the present study was to establish whether patients with major depression (MD) exhibit a memory bias for sad faces, relative to happy and neutral faces, when the affective element of the faces is not explicitly processed at encoding. To this end, 16 psychiatric out-patients with MD and 18 healthy, never-depressed controls (HC) were presented with a series of emotional faces and were required to identify the gender of the individuals featured in the photographs. Participants were subsequently given a recognition memory test for these faces. At encoding, patients with MD exhibited a non-significant tendency towards slower gender identification (GI) times, relative to HC, for happy faces. However, the GI times of the two groups did not differ for sad or neutral faces. At memory testing, patients with MD did not exhibit the expected memory bias for sad faces. Similarly, HC did not demonstrate enhanced memory for happy faces. Overall, patients with MD were impaired in their memory for the faces relative to the HC. The current findings are consistent with the proposal that mood-congruent memory biases are contingent upon explicit processing of the emotional element of the to-be-remembered material at encoding.

2.
The present study examined whether an information-processing bias against emotional facial expressions is present among individuals with social anxiety. College students with high social anxiety (n = 26) and low social anxiety (n = 26) performed three different types of working memory tasks: (a) ordering positive and negative facial expressions according to the intensity of emotion; (b) ordering pictures of faces according to age; and (c) ordering geometric shapes according to size. The high social anxiety group performed significantly more poorly than the low social anxiety group on the facial expression task, but not on the other two tasks with the nonemotional stimuli. These results suggest that high social anxiety interferes with the processing of emotionally charged facial expressions.

3.
Findings from subjects with unilateral brain damage, as well as from normal subjects studied with tachistoscopic paradigms, argue that emotion is processed differently by each brain hemisphere. An open question concerns the extent to which such lateralised processing might occur under natural, free-viewing conditions. To explore this issue, we asked 28 normal subjects to discriminate emotions expressed by pairs of faces shown side-by-side, with no time or viewing constraints. Images of neutral expressions were shown paired with morphed images of very faint emotional expressions (happiness, surprise, disgust, fear, anger, or sadness). We found a surprising and robust laterality effect: When discriminating negative emotional expressions, subjects performed significantly better when the emotional face was to the left of the neutral face; conversely, when discriminating positive expressions, subjects performed better when the emotional face was to the right. We interpret this valence-specific laterality effect as consistent with the idea that the right hemisphere is specialised to process negative emotions, whereas the left is specialised to process positive emotions. The findings have important implications for how humans perceive facial emotion under natural conditions.

4.
This study identified components of attentional bias (e.g., attentional vigilance, attentional avoidance and difficulty with disengagement) that are critical characteristics of survivors of dating violence (DV). Eye movements were recorded to obtain accurate and continuous information regarding attention. DV survivors with high post-traumatic stress symptoms (DV-High PTSS group; n = 20) and low post-traumatic stress symptoms (DV-Low PTSS group; n = 22) and participants who had never experienced DV (NDV group; n = 21) were shown screens displaying emotional (angry, fearful and happy) faces paired with neutral faces and negative (angry and fearful) faces paired with happy faces for 10 s. The results indicate that the DV-High PTSS group spent longer dwelling on angry faces over time compared with the DV-Low PTSS and NDV groups. This result implies that the DV-High PTSS group focused on specific trauma-related stimuli but does not provide evidence of an attentional bias towards threatening stimuli in general.
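To make the dwell-time measure used above concrete, here is a minimal Python sketch; the data layout, the 500 Hz sampling rate, and the rectangular AOI coordinates are illustrative assumptions, not details taken from the study.

# Minimal sketch (assumed data format): total dwell time on an area of interest (AOI)
# computed from raw gaze samples. Sampling rate and AOI coordinates are hypothetical.
import numpy as np

def dwell_time(gaze_x, gaze_y, aoi, sample_rate_hz=500):
    # aoi = (x_min, y_min, x_max, y_max) in screen pixels
    gaze_x = np.asarray(gaze_x)
    gaze_y = np.asarray(gaze_y)
    inside = ((gaze_x >= aoi[0]) & (gaze_x <= aoi[2]) &
              (gaze_y >= aoi[1]) & (gaze_y <= aoi[3]))
    return inside.sum() / sample_rate_hz   # seconds of gaze inside the AOI

rng = np.random.default_rng(0)
x = rng.uniform(0, 1920, 5000)   # 10 s of simulated gaze samples at 500 Hz
y = rng.uniform(0, 1080, 5000)
print(dwell_time(x, y, aoi=(0, 0, 960, 1080)))   # seconds spent on the left-hand face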

5.
Memory biases toward threat have been documented in several anxiety disorders, but contradictory findings have recently been reported in social phobics' recognition of facial expressions. The present study examined recognition memory in clients with social phobia, in an effort to clarify previous inconsistent results. Just before giving a speech to a live audience, social phobia clients and normal controls viewed photographs of people with reassuring and threatening facial expressions. The stimuli were later presented again alongside photographs of the same person with a different facial expression, and participants chose which face they had seen before. Individuals with social phobia were less accurate at recognizing previously seen photographs than controls, apparently due to state anxiety. In contrast, social phobics did not show a memory bias toward threatening facial expressions. Theoretical and treatment implications are discussed.

6.
Unconscious facial reactions to emotional facial expressions
Studies reveal that when people are exposed to emotional facial expressions, they spontaneously react with distinct facial electromyographic (EMG) reactions in emotion-relevant facial muscles. These reactions reflect, in part, a tendency to mimic the facial stimuli. We investigated whether corresponding facial reactions can be elicited when people are unconsciously exposed to happy and angry facial expressions. Through use of the backward-masking technique, the subjects were prevented from consciously perceiving 30-ms exposures of happy, neutral, and angry target faces, which were immediately followed and masked by neutral faces. Despite the fact that exposure to happy and angry faces was unconscious, the subjects reacted with distinct facial muscle reactions that corresponded to the happy and angry stimulus faces. Our results show that both positive and negative emotional reactions can be unconsciously evoked, and particularly that important aspects of emotional face-to-face communication can occur on an unconscious level.

7.
This study explored how rapidly emotion-specific facial muscle reactions were elicited when subjects were exposed to pictures of angry and happy facial expressions. In three separate experiments, it was found that distinctive facial electromyographic reactions, i.e., greater Zygomaticus major muscle activity in response to happy than to angry stimuli and greater Corrugator supercilii muscle activity in response to angry than to happy stimuli, were detectable after only 300–400 ms of exposure. These findings demonstrate that facial reactions are quickly elicited, indicating that expressive emotional reactions can be very rapidly manifested and are perhaps controlled by fast-operating facial affect programs.
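As a rough illustration of this kind of facial EMG measure (not the authors' actual analysis pipeline), the sketch below computes baseline-corrected mean rectified activity in an assumed 300–400 ms post-stimulus window; the epoch layout, sampling rate, and simulated data are hypothetical.

# Hypothetical sketch: baseline-corrected mean rectified EMG in a post-stimulus window.
# Epoch shape, sampling rate, and the simulated trial data are assumptions.
import numpy as np

def window_response(epochs, sample_rate_hz=1000, baseline=(-0.1, 0.0), window=(0.3, 0.4)):
    # epochs: (n_trials, n_samples) array, time-locked so the epoch starts at
    # baseline[0] seconds and stimulus onset is at t = 0.
    t0 = baseline[0]
    rect = np.abs(epochs)                      # full-wave rectification
    def mean_in(win):
        i0 = int(round((win[0] - t0) * sample_rate_hz))
        i1 = int(round((win[1] - t0) * sample_rate_hz))
        return rect[:, i0:i1].mean(axis=1)
    return mean_in(window) - mean_in(baseline)  # per-trial change from baseline

rng = np.random.default_rng(1)
zygomaticus_happy = rng.normal(0.0, 1.0, (40, 500))   # 40 trials, 0.5 s epochs at 1 kHz
zygomaticus_happy[:, 400:] += 0.5                      # simulated activity from ~300 ms on
print(window_response(zygomaticus_happy).mean())       # mean baseline-corrected response (a.u.)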

8.
Current Psychology - Memory for others’ sad affective facial expressions may be relevant to depression risk, given that biases have been linked to major depression and transient sad mood...

9.
Ribeiro, L. A. & Fearon, P. (2010). Theory of mind and attentional bias to facial emotional expressions: A preliminary study. Scandinavian Journal of Psychology. Theory of mind ability has been associated with performance in interpersonal interactions and has been found to influence aspects such as emotion recognition, social competence, and social anxiety. Being able to attribute mental states to others requires attention to subtle communication cues such as facial emotional expressions. Decoding and interpreting emotions expressed by the face, especially those with negative valence, are essential skills to successful social interaction. The current study explored the association between theory of mind skills and attentional bias to facial emotional expressions. In line with the study hypothesis, individuals with poor theory of mind skills showed preferential attention to negative faces over both non-negative faces and neutral objects. Tentative explanations for the findings are offered, emphasizing the potential adaptive role of vigilance for threat as a way of allocating a limited capacity to interpret others’ mental states to obtain as much information as possible about potential danger in the social environment.

10.
Facial expressions are critical for effective social communication, and as such may be processed by the visual system even when it might be advantageous to ignore them. Previous research has shown that categorising emotional words was impaired when faces of a conflicting valence were simultaneously presented. In the present study, we examined whether emotional word categorisation would also be impaired when faces of the same (negative) valence but different emotional category (either angry, sad or fearful) were simultaneously presented. Behavioural results provided evidence for involuntary processing of basic emotional facial expression category, with slower word categorisation when the face and word categories were incongruent (e.g., angry word and sad face) than congruent (e.g., angry word and angry face). Event-related potentials (ERPs) time-locked to the presentation of the word-face pairs also revealed that emotional category congruency effects were evident from approximately 170 ms after stimulus onset.

12.
13.
Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers’ discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum “felt the same” or “felt different.” In the identification task, images were presented individually and participants were asked to label the emotion displayed on the face (e.g., “Does she look happy or sad?”). Results suggest that 3.5-year-olds have the same category boundary as adults. They were more likely to report that the image pairs felt “different” for the pair that crossed the category boundary. These results suggest that 3.5-year-olds perceive happy and sad emotional facial expressions categorically as adults do. Categorizing emotional expressions is advantageous for children if it allows them to use social information faster and more efficiently.
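A standard way to estimate the kind of category boundary discussed in this and the following entries (not necessarily the analysis used in these particular studies) is to fit a logistic function to identification rates along the morph continuum and take its 50% point; the morph levels and response proportions below are invented for illustration.

# Illustrative sketch: locate a category boundary on a sad-happy morph continuum
# by fitting a logistic to identification data. All data values are invented.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))   # x0 = category boundary, k = slope

morph_level = np.linspace(0, 1, 9)                # 0 = 100% sad, 1 = 100% happy
p_happy = np.array([0.02, 0.05, 0.10, 0.20, 0.55, 0.85, 0.93, 0.97, 0.99])  # invented rates

(x0, k), _ = curve_fit(logistic, morph_level, p_happy, p0=[0.5, 10.0])
print(f"estimated category boundary at morph level {x0:.2f}")
# Categorical perception predicts better discrimination for pairs straddling x0
# than for equally spaced pairs lying on the same side of it.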

14.
Adults perceive emotional expressions categorically, with discrimination being faster and more accurate between expressions from different emotion categories (i.e., blends with two different predominant emotions) than between two stimuli from the same category (i.e., blends with the same predominant emotion). The current study sought to test whether facial expressions of happiness and fear are perceived categorically by pre-verbal infants, using a new stimulus set that was shown to yield categorical perception in adult observers (Experiments 1 and 2). These stimuli were then used with 7-month-old infants (N = 34) using a habituation and visual preference paradigm (Experiment 3). Infants were first habituated to an expression of one emotion, then presented with the same expression paired with a novel expression either from the same emotion category or from a different emotion category. After habituation to fear, infants displayed a novelty preference for pairs of between-category expressions, but not within-category ones, showing categorical perception. However, infants showed no novelty preference when they were habituated to happiness. Our findings provide evidence for categorical perception of emotional expressions in pre-verbal infants, while the asymmetrical effect challenges the notion of a bias towards negative information in this age group.

15.
Detection of emotional facial expressions has been shown to be more efficient than detection of neutral expressions. However, it remains unclear whether this effect is attributable to visual or emotional factors. To investigate this issue, we conducted two experiments using the visual search paradigm with photographic stimuli. We included a single target facial expression of anger or happiness in presentations of crowds of neutral facial expressions. The anti-expressions of anger and happiness were also presented. Although anti-expressions produced changes in visual features comparable to those of the emotional facial expressions, they expressed relatively neutral emotions. The results consistently showed that reaction times (RTs) for detecting emotional facial expressions (both anger and happiness) were shorter than those for detecting anti-expressions. The RTs for detecting the expressions were negatively related to experienced emotional arousal. These results suggest that efficient detection of emotional facial expressions is not attributable to their visual characteristics but rather to their emotional significance.
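A minimal sketch of the kind of RT comparison implied here, using simulated per-participant means and a paired t-test; the values and sample size are invented, and the study's actual statistics may differ.

# Hypothetical sketch: compare mean detection RTs for emotional vs. anti-expression
# targets across participants with a paired t-test. Data are simulated.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)
rt_emotional = rng.normal(620.0, 60.0, 24)              # ms, one mean per participant
rt_anti = rt_emotional + rng.normal(45.0, 30.0, 24)     # anti-expressions detected more slowly
t, p = ttest_rel(rt_emotional, rt_anti)
print(f"t = {t:.2f}, p = {p:.4f}")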

16.
We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness–fear and anger–disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness–fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise–sadness and excitement–disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.
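For readers unfamiliar with the morphing step mentioned in these entries, the sketch below shows a highly simplified pixel-wise cross-dissolve between two aligned face images; real expression morphs also interpolate facial geometry, which this toy example omits, and the image arrays here are random placeholders.

# Toy sketch of a morph continuum: pixel-wise cross-dissolve between two aligned
# face images. Real morphing software also warps facial geometry.
import numpy as np

def cross_dissolve(img_a, img_b, alpha):
    # alpha = 0 returns img_a (e.g., 100% happiness); alpha = 1 returns img_b (e.g., 100% fear)
    return (1.0 - alpha) * img_a + alpha * img_b

happy = np.random.rand(256, 256)    # stand-ins for aligned grayscale face photographs
fear = np.random.rand(256, 256)
continuum = [cross_dissolve(happy, fear, a) for a in np.linspace(0, 1, 9)]
print(len(continuum), continuum[4].shape)   # 9-step continuum, middle morph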

18.
The present electromyographic study is a first step toward shedding light on the involvement of affective processes in congruent and incongruent facial reactions to facial expressions. Further, empathy was investigated as a potential mediator underlying the modulation of facial reactions to emotional faces in a competitive, a cooperative, and a neutral setting. Results revealed less congruent reactions to happy expressions and even incongruent reactions to sad and angry expressions in the competition condition, whereas virtually no differences between the neutral and the cooperation condition occurred. Effects on congruent reactions were found to be mediated by cognitive empathy, indicating that the state of empathy plays an important role in the situational modulation of congruent reactions. Further, incongruent reactions to sad and angry faces in a competition setting were mediated by the emotional reaction of joy, supporting the assumption that incongruent facial reactions are mainly based on affective processes. Additionally, strategic processes (specifically, the goal to create and maintain a smooth, harmonious interaction) were found to influence facial reactions while participants were in a cooperative mindset. Further studies are needed to test the generalizability of these effects.

19.
Posers were requested to produce happy and sad emotional expressions, deliberately accentuated on the left and right sides of the face. Raters judged the emotional intensity of expressions when presented in original and mirror-reverse orientation. Left-side-accentuated sad expressions were rated as more intense than right-side-accentuated sad expressions. Raters were biased to judge expressions as more intense when the accentuated side was to their left. The findings indicated that the perceiver bias in weighting information from the side of the face in left hemispace extends to judgments of emotional intensity.

20.
Faces with expressions (happy, surprise, anger, fear) were presented at study. Memory for facial expressions was tested by presenting the same faces with neutral expressions and asking participants to determine the expression that had been displayed at study. In three experiments, happy expressions were remembered better than other expressions. The advantage of a happy face was observed even when faces were inverted (upside down) and even when the salient perceptual feature (broad grin) was controlled across conditions. These findings are couched in terms of source monitoring, in which memory for facial expressions reflects encoding of the dispositional context of a prior event.
