Similar Articles
20 similar articles found (search time: 15 ms)
1.
This study explored how rapidly emotion-specific facial muscle reactions were elicited when subjects were exposed to pictures of angry and happy facial expressions. In three separate experiments, distinctive facial electromyographic reactions, i.e., greater zygomaticus major muscle activity in response to happy than to angry stimuli and greater corrugator supercilii muscle activity in response to angry than to happy stimuli, were detectable after only 300–400 ms of exposure. These findings demonstrate that facial reactions are quickly elicited, indicating that expressive emotional reactions can be very rapidly manifested and are perhaps controlled by fast-operating facial affect programs.

2.
Infants’ ability to discriminate emotional facial expressions and tones of voice is well-established, yet little is known about infant discrimination of emotional body movements. Here, we asked if 10–20-month-old infants rely on high-level emotional cues or low-level motion related cues when discriminating between emotional point-light displays (PLDs). In Study 1, infants viewed 18 pairs of angry, happy, sad, or neutral PLDs. Infants looked more at angry vs. neutral, happy vs. neutral, and neutral vs. sad. Motion analyses revealed that infants preferred the PLD with more total body movement in each pairing. Study 2, in which infants viewed inverted versions of the same pairings, yielded similar findings except for sad-neutral. Study 3 directly paired all three emotional stimuli in both orientations. The angry and happy stimuli did not significantly differ in terms of total motion, but both had more motion than the sad stimuli. Infants looked more at angry vs. sad, more at happy vs. sad, and about equally to angry vs. happy in both orientations. Again, therefore, infants preferred PLDs with more total body movement. Overall, the results indicate that a low-level motion preference may drive infants’ discrimination of emotional human walking motions.

3.
The study examined self-reported emotion and facial muscle and autonomic activity of depressed and nondepressed men in response to the social context of emotional situations. 20 university men, assessed on the Beck Depression Inventory, were asked to imagine happy and sad situations with and without visualizing other people. No differences were found between men classified as depressed and nondepressed on self-reported emotion and facial muscle activity. Smiling did not differ between social contexts, although self-reported happiness was greater during happy-social than during happy-solitary imagery. Adjusting smiling for social context differences in happiness showed less smiling during happy-social than during happy-solitary imagery. In contrast, self-reported sadness and frowning were greater during sad-social compared to sad-solitary imagery. No differences between social contexts were found when frowning was adjusted for social context differences in sadness. Depressed-scoring men showed higher mean heart rate during sad-social than sad-solitary imagery, whereas nondepressed-scoring men showed higher mean heart rate during happy-social compared to happy-solitary imagery. The results indicate that men may frown more when sad but generally do not smile more during happy-social imagery, independent of depression. Depressed mood may affect heart rate during sad imagery but may not alter facial muscle activity and self-reported emotion in men.

4.
Using a categorical-perception emotion recognition paradigm, this study examined perceptual bias and perceptual sensitivity to ambiguous happy-angry and happy-sad facial expressions in children high and low in shyness. The results showed that (1) compared with low-shyness children, high-shyness children tended to perceive ambiguous happy-angry faces as angry and ambiguous happy-sad faces as sad; and (2) the two groups did not differ significantly in the slope at the category boundary for either the happy-angry or the happy-sad continuum. These findings suggest that highly shy children show a hostile attribution bias and stronger empathic responses to sadness, but are not differentially sensitive to the category shift in happy-angry and happy-sad expressions.
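The boundary-slope measure in finding (2) is typically read off a fitted logistic psychometric function: with p(x) = 1 / (1 + exp(−k·(x − x0))), the slope at the category boundary x0 equals k/4, so a steeper k indicates sharper categorical perception. A minimal sketch of this relationship (the parameter values below are illustrative, not from the study):

```python
import math

def psychometric(x: float, k: float, x0: float) -> float:
    """Logistic probability of an 'angry' response at morph level x (0-1)."""
    return 1.0 / (1.0 + math.exp(-k * (x - x0)))

def boundary_slope(k: float) -> float:
    """Derivative of the logistic at its inflection point x0 is k/4."""
    return k / 4.0

# Illustrative parameters: category boundary at morph level 0.5, steepness k = 8.
print(psychometric(0.5, k=8.0, x0=0.5))  # 0.5 at the category boundary
print(boundary_slope(8.0))               # 2.0
```

A non-significant slope difference between groups, as reported above, corresponds to similar fitted k values for the two groups.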

6.
The authors investigated the ability of children with emotional and behavioral difficulties, divided according to their Psychopathy Screening Device scores (P. J. Frick & R. D. Hare, in press), to recognize emotional facial expressions and vocal tones. The Psychopathy Screening Device indexes a behavioral syndrome with two dimensions: affective disturbance and impulsive and conduct problems. Nine children with psychopathic tendencies and 9 comparison children were presented with 2 facial expression and 2 vocal tone subtests from the Diagnostic Analysis of Nonverbal Accuracy (S. Nowicki & M. P. Duke, 1994). These subtests measure the ability to name sad, fearful, happy, and angry facial expressions and vocal affects. The children with psychopathic tendencies showed selective impairments in the recognition of both sad and fearful facial expressions and sad vocal tone. In contrast, the two groups did not differ in their recognition of happy or angry facial expressions or fearful, happy, and angry vocal tones. The results are interpreted with reference to the suggestion that the development of psychopathic tendencies may reflect early amygdala dysfunction (R. J. R. Blair, J. S. Morris, C. D. Frith, D. I. Perrett, & R. Dolan, 1999).

8.
Previous research has demonstrated that depression is associated with dysfunctional attentional processing of emotional information. Most studies have examined this bias by recording response latencies. The present study employed an ecologically valid measure of attentive processing, using eye-movement registration. Dysphoric and non-dysphoric participants viewed slides presenting sad, angry, happy, and neutral facial expressions. For each type of expression, three components of visual attention were analysed: the relative fixation frequency, fixation time, and glance duration. Attentional biases were also investigated for inverted facial expressions to ensure that they were not related to eye-catching facial features. Results indicated that non-dysphoric individuals were characterised by longer fixation and dwelling on happy faces. Dysphoric individuals demonstrated longer dwelling on sad and neutral faces. These results were not found for inverted facial expressions. The present findings are in line with the assumption that depression is associated with prolonged attentional elaboration on negative information.

9.
Unconscious facial reactions to emotional facial expressions
Studies reveal that when people are exposed to emotional facial expressions, they spontaneously react with distinct facial electromyographic (EMG) reactions in emotion-relevant facial muscles. These reactions reflect, in part, a tendency to mimic the facial stimuli. We investigated whether corresponding facial reactions can be elicited when people are unconsciously exposed to happy and angry facial expressions. Through use of the backward-masking technique, the subjects were prevented from consciously perceiving 30-ms exposures of happy, neutral, and angry target faces, which immediately were followed and masked by neutral faces. Despite the fact that exposure to happy and angry faces was unconscious, the subjects reacted with distinct facial muscle reactions that corresponded to the happy and angry stimulus faces. Our results show that both positive and negative emotional reactions can be unconsciously evoked, and particularly that important aspects of emotional face-to-face communication can occur on an unconscious level.

10.
Facial electromyographic (EMG) activity at the corrugator and zygomatic muscle regions was recorded in 37 subjects while they posed happy and sad facial expressions. Analysis showed that while a happy facial expression was posed, mean EMG activity was highest at the left zygomatic muscle region, followed by the right zygomatic, left corrugator, and right corrugator muscle regions. While a sad facial expression was posed, mean EMG activity was highest at the left corrugator muscle region, followed by the right corrugator, left zygomatic, and right zygomatic muscle regions. Further analysis indicated that facial EMG activity was stronger on the left side of the face than on the right side while posing both happy and sad expressions.

11.
We investigated the source of the visual search advantage of some emotional facial expressions. An emotional face target (happy, surprised, disgusted, fearful, angry, or sad) was presented in an array of neutral faces. Detection was faster for happy targets, with angry and, especially, sad targets being detected more poorly. Physical image properties (e.g., luminance) were ruled out as a potential source of these differences in visual search. In contrast, the search advantage is partly due to facilitated processing of affective content, as shown by an emotion identification task. Happy expressions were identified faster than the other expressions and were less likely to be confused with neutral faces, whereas misjudgements occurred more often for angry and sad expressions. Nevertheless, the distinctiveness of some local features (e.g., teeth) that are consistently associated with emotional expressions plays the strongest role in the search advantage pattern. When the contribution of these features to visual search was factored out statistically, the advantage disappeared.

12.
The current study investigated 6-, 9-, and 12-month-old infants’ ability to categorically perceive facial emotional expressions depicting faces from two continua: happy–sad and happy–angry. In a between-subject design, infants were tested on their ability to discriminate faces that were between-category (across the category boundary) or within-category (within emotion category). Results suggest that 9- and 12-month-olds can discriminate between but not within categories for the happy–angry continuum. Infants could not discriminate between cross-boundary facial expressions in the happy–sad continuum at any age. We suggest a functional account; categorical perception may develop in conjunction with the emotion's relevance to the infant.

13.
We investigated whether emotional information from facial expression and hand movement quality was integrated when identifying the expression of a compound stimulus showing a static facial expression combined with emotionally expressive dynamic manual actions. The emotions (happiness, neutrality, and anger) expressed by the face and hands were either congruent or incongruent. In Experiment 1, the participants judged whether the stimulus person was happy, neutral, or angry. Judgments were mainly based on the facial expressions, but were affected by manual expressions to some extent. In Experiment 2, the participants were instructed to base their judgment on the facial expression only. An effect of hand movement expressive quality was observed for happy facial expressions. The results conform with the proposal that perception of facial expressions of emotions can be affected by the expressive qualities of hand movements.

14.
This study examined whether 4‐month‐olds (N = 40) could perceptually categorize happy and angry faces, and show appropriate behavior in response to these faces. During the habituation phase, infants were shown the same type of facial expressions (happy or angry) posed by three models, and their behavior in response to those faces was observed. During the test phase immediately after the habituation phase, infants saw a novel emotional expression and a familiar expression posed by a new model, and their looking times were measured. The results indicated that, although 4‐month‐olds could perceptually categorize happy and angry faces accurately, they responded positively to both expression types. These findings suggest that, although infants can perceptually categorize facial expressions at 4 months of age, they require further time to learn the affective meanings of the facial expressions.

15.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression and social phobia and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed.

16.
A signal-detection task was used to assess sex differences in emotional face recognition under conditions of uncertainty. Computer images of Ekman faces showing sad, angry, happy, and fearful emotional states were presented for 50 ms to thirty-six men and thirty-seven women. All participants monitored for presentation of either happy, angry, or sad emotional expressions in three separate blocks. Happy faces were the most easily discriminated. Sad and angry expressions were most often mistaken for each other. Analyses of d' values, hit rates, and reaction times all yielded similar results, with no sex differences for any of the measures.
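The d' values analyzed above come from standard signal-detection theory: d' is the difference between the z-transformed hit rate and the z-transformed false-alarm rate. A minimal sketch of that computation (the rates below are made-up illustration values, not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical rates for one emotion-monitoring block (not from the study):
print(round(d_prime(0.80, 0.20), 3))  # symmetric case: 2 * z(0.80) ≈ 1.683
```

Higher d' means the emotion was more easily discriminated from the non-target expressions, independent of response bias.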

17.
We examined proactive and reactive control effects in the context of task-relevant happy, sad, and angry facial expressions on a face-word Stroop task. Participants identified the emotion expressed by a face that contained a congruent or incongruent emotional word (happy/sad/angry). Proactive control effects were measured in terms of the reduction in Stroop interference (the difference between incongruent and congruent trials) as a function of previous-trial emotion and previous-trial congruence. Reactive control effects were measured in terms of the reduction in Stroop interference as a function of current-trial emotion and previous-trial congruence. Negative emotions on the previous trial exerted a greater influence on proactive control than did the positive emotion. Sad faces on the previous trial produced a greater reduction in Stroop interference for happy faces on the current trial. However, current-trial angry faces showed stronger adaptation effects than happy faces. Thus, both proactive and reactive control mechanisms depend on the emotional valence of task-relevant stimuli.
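The interference and sequential-adaptation measures described above are simple arithmetic on mean reaction times: interference is mean incongruent RT minus mean congruent RT, and the adaptation (congruence-sequence) effect is the interference after congruent previous trials minus the interference after incongruent ones. A sketch with invented RT values, not the study's data:

```python
from statistics import mean

def interference(rts_incongruent, rts_congruent):
    """Stroop interference: mean incongruent RT minus mean congruent RT (ms)."""
    return mean(rts_incongruent) - mean(rts_congruent)

# Invented RT samples (ms), split by previous-trial congruence:
after_congruent = interference([720, 740, 760], [640, 650, 660])    # 90 ms
after_incongruent = interference([690, 700, 710], [645, 655, 665])  # 45 ms

# Sequential adaptation: interference shrinks after incongruent trials.
adaptation_effect = after_congruent - after_incongruent
print(adaptation_effect)
```

A positive adaptation effect is the signature of increased control following conflict, which the study conditions on the emotion of the previous and current trials.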

18.
The purpose of this study is to explore whether subjects exposed to stimuli of facial expressions respond with facial electromyographic (EMG) reactions consistent with the hypothesis that facial expressions are contagious. This study further examines whether males and females differ in facial EMG intensity. Two experiments demonstrated that subjects responded with facial EMG activity over the corrugator supercilii, the zygomatic major, the lateral frontalis, the depressor supercilii, and the levator labii muscle regions to stimuli of sad, angry, fearful, surprised, disgusted, and happy faces that, to a large extent, were consistent with the hypothesis that facial expressions are contagious. Aspects of gender differences reported in earlier studies were found, indicating a tendency for females to respond with more pronounced facial EMG intensity.

19.
We systematically examined the impact of emotional stimuli on time perception in a temporal reproduction paradigm where participants reproduced the duration of a facial emotion stimulus using an oval-shaped stimulus or vice versa. Experiment 1 asked participants to reproduce the duration of an angry face (or the oval) presented for 2,000 ms. Experiment 2 included a range of emotional expressions (happy, sad, angry, and neutral faces as well as the oval stimulus) presented for different durations (500, 1,500, and 2,000 ms). We found that participants over-reproduced the durations of happy and sad faces using the oval stimulus. By contrast, there was a trend of under-reproduction when the duration of the oval stimulus was reproduced using the angry face. We suggest that increased attention to a facial emotion produces the relativity of time perception.
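Over- and under-reproduction in a paradigm like this are commonly expressed as a reproduction ratio (reproduced duration divided by target duration), with values above 1 indicating over-reproduction and values below 1 indicating under-reproduction. A small illustration with made-up durations, not the study's data:

```python
def reproduction_ratio(reproduced_ms: float, target_ms: float) -> float:
    """Ratio > 1 means the duration was over-reproduced; < 1, under-reproduced."""
    return reproduced_ms / target_ms

# Made-up single-trial reproductions of a 2,000-ms target:
print(reproduction_ratio(2300, 2000))  # > 1: over-reproduction
print(reproduction_ratio(1800, 2000))  # < 1: under-reproduction
```

Averaging such ratios per condition is one common way to compare, say, reproductions of happy faces against reproductions of the neutral oval.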

20.
To explore how individuals with high trait anxiety process emotional stimuli at the pre-attentive stage and to clarify their emotional bias, this study used a deviant-standard reverse oddball paradigm to examine the influence of trait anxiety on the pre-attentive processing of facial expressions. The results showed that in the low trait anxiety group, the early expression-related mismatch negativity (EMMN) evoked by sad faces was significantly larger than that evoked by happy faces, whereas in the high trait anxiety group, the early EMMN did not differ significantly between happy and sad faces. Moreover, the EMMN amplitude for happy faces was significantly larger in the high trait anxiety group than in the low trait anxiety group. These results indicate that personality traits are an important factor influencing the pre-attentive processing of facial expressions. Unlike typical participants, individuals with high trait anxiety show similar pre-attentive processing of happy and sad faces and may have difficulty effectively distinguishing between happy and sad emotional faces.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号