Similar Documents
20 similar documents found (search time: 15 ms)
1.
This study explored how rapidly emotion-specific facial muscle reactions were elicited when subjects were exposed to pictures of angry and happy facial expressions. In three separate experiments, it was found that distinctive facial electromyographic reactions, i.e., greater Zygomaticus major muscle activity in response to happy than to angry stimuli and greater Corrugator supercilii muscle activity in response to angry than to happy stimuli, were detectable after only 300–400 ms of exposure. These findings demonstrate that facial reactions are quickly elicited, indicating that expressive emotional reactions can be very rapidly manifested and are perhaps controlled by fast-operating facial affect programs.

2.
We investigated whether emotional information from facial expression and hand movement quality was integrated when identifying the expression of a compound stimulus showing a static facial expression combined with emotionally expressive dynamic manual actions. The emotions (happiness, neutrality, and anger) expressed by the face and hands were either congruent or incongruent. In Experiment 1, the participants judged whether the stimulus person was happy, neutral, or angry. Judgments were mainly based on the facial expressions, but were affected by manual expressions to some extent. In Experiment 2, the participants were instructed to base their judgment on the facial expression only. An effect of hand movement expressive quality was observed for happy facial expressions. The results conform with the proposal that perception of facial expressions of emotions can be affected by the expressive qualities of hand movements.

3.
This study investigated whether sensitivity to and evaluation of facial expressions varied with repeated exposure to non-prototypical facial expressions for a short presentation time. A morphed facial expression was presented for 500 ms repeatedly, and participants were required to indicate whether each facial expression was happy or angry. We manipulated the distribution of presentations of the morphed facial expressions for each facial stimulus. Some of the individuals depicted in the facial stimuli expressed anger frequently (i.e., anger-prone individuals), while the others expressed happiness frequently (i.e., happiness-prone individuals). After being exposed to the faces of anger-prone individuals, the participants became less sensitive to those individuals’ angry faces. Further, after being exposed to the faces of happiness-prone individuals, the participants became less sensitive to those individuals’ happy faces. We also found a relative increase in the social desirability of happiness-prone individuals after exposure to the facial stimuli.

4.
Unconscious facial reactions to emotional facial expressions
Studies reveal that when people are exposed to emotional facial expressions, they spontaneously react with distinct facial electromyographic (EMG) reactions in emotion-relevant facial muscles. These reactions reflect, in part, a tendency to mimic the facial stimuli. We investigated whether corresponding facial reactions can be elicited when people are unconsciously exposed to happy and angry facial expressions. Through use of the backward-masking technique, the subjects were prevented from consciously perceiving 30-ms exposures of happy, neutral, and angry target faces, which immediately were followed and masked by neutral faces. Despite the fact that exposure to happy and angry faces was unconscious, the subjects reacted with distinct facial muscle reactions that corresponded to the happy and angry stimulus faces. Our results show that both positive and negative emotional reactions can be unconsciously evoked, and particularly that important aspects of emotional face-to-face communication can occur on an unconscious level.

5.
Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study evaluated mechanisms underlying RFRs during childhood and examined possible impairment in children with ASD. Experiment 1 found RFRs to happy and angry faces (not fear faces) in 15 typically developing children from 7 to 12 years of age. RFRs of fear (not anger) in response to angry faces indicated an emotional mechanism. In 11 children (8-13 years of age) with ASD, Experiment 2 found undifferentiated RFRs to fear expressions and no consistent RFRs to happy or angry faces. However, as children with ASD aged, matching RFRs to happy faces increased significantly, suggesting the development of processes underlying matching RFRs during this period in ASD.

6.
Using a visual search paradigm, we investigated how a top-down goal modified attentional bias for threatening facial expressions. In two experiments, participants searched for a facial expression either based on stimulus characteristics or a top-down goal. In Experiment 1, participants searched for a discrepant facial expression in a homogenous crowd of faces. Consistent with previous research, we obtained a shallower response time (RT) slope when the target face was angry than when it was happy. In Experiment 2, participants searched for a specific type of facial expression (allowing a top-down goal). When the display included a target, we found a shallower RT slope for the angry than for the happy face search. However, when an angry or happy face was present in the display in opposition to the task goal, we obtained equivalent RT slopes, suggesting that the mere presence of an angry face in opposition to the task goal did not support the well-known angry face superiority effect. Furthermore, RT distribution analyses supported the special status of an angry face only when it was combined with the top-down goal. On the basis of these results, we suggest that a threatening facial expression may guide attention as a high-priority stimulus in the absence of a specific goal; however, in the presence of a specific goal, the efficiency of facial expression search is dependent on the combined influence of a top-down goal and the stimulus characteristics.

7.
We examined dysfunctional memory processing of facial expressions in relation to alexithymia. Individuals with high and low alexithymia, as measured by the Toronto Alexithymia Scale (TAS-20), participated in a visual search task (Experiment 1A) and a change-detection task (Experiments 1B and 2), to assess differences in their visual short-term memory (VSTM). In the visual search task, the participants were asked to judge whether all facial expressions (angry and happy faces) in the search display were the same or different. In the change-detection task, they had to decide whether all facial expressions changed between two successive displays. We found individual differences only in the change-detection task. Individuals with high alexithymia showed lower sensitivity for the happy faces compared to the angry faces, while individuals with low alexithymia showed sufficient recognition for both facial expressions. Experiment 2 examined whether individual differences arose during the early storage stage or the later retrieval stage of the VSTM process, using a single-probe paradigm. We found no effect of the single-probe manipulation, indicating that the individual differences occurred at the storage stage. The present results provide new evidence that individuals with high alexithymia show specific impairment in VSTM processes (especially the storage stage) related to happy but not to angry faces.

8.
The hypotheses of this investigation were derived by conceiving of automatic mimicking as a component of emotional empathy. Differences between subjects high and low in emotional empathy were investigated. The parameters compared were facial mimicry reactions, as represented by electromyographic (EMG) activity when subjects were exposed to pictures of angry or happy faces, and the degree of correspondence between subjects' facial EMG reactions and their self-reported feelings. The comparisons were made at different stimulus exposure times in order to elicit reactions at different levels of information processing. The high-empathy subjects were found to have a higher degree of mimicking behavior than the low-empathy subjects, a difference that emerged at short exposure times (17-40 ms) that represented automatic reactions. Already at short exposure times (17-40 ms), the low-empathy subjects tended to show inverse Zygomaticus major muscle reactions, namely "smiling" when exposed to an angry face. The high-empathy group was characterized by a significantly higher correspondence between facial expressions and self-reported feelings. No differences were found between the high- and low-empathy subjects in their verbally reported feelings when presented with a happy or an angry face. Thus, the differences between the groups in emotional empathy appeared to be related to differences in automatic somatic reactions to facial stimuli rather than to differences in their conscious interpretation of the emotional situation.

9.
Facial emotions are important for human communication. Unfortunately, traditional facial emotion recognition tasks do not inform about how respondents might behave towards others expressing certain emotions. Approach‐avoidance tasks do measure behaviour, but only on one dimension. In this study 81 participants completed a novel Facial Emotion Response Task. Images displaying individuals with emotional expressions were presented in random order. Participants simultaneously indicated how communal (quarrelsome vs. agreeable) and how agentic (dominant vs. submissive) they would be in response to each expression. We found that participants responded differently to happy, angry, fearful, and sad expressions in terms of both dimensions of behaviour. Higher levels of negative affect were associated with less agreeable responses specifically towards happy and sad expressions. The Facial Emotion Response Task might complement existing facial emotion recognition and approach‐avoidance tasks.

10.
Previous research suggests that neural and behavioral responses to surprised faces are modulated by explicit contexts (e.g., "He just found $500"). Here, we examined the effect of implicit contexts (i.e., valence of other frequently presented faces) on both valence ratings and ability to detect surprised faces (i.e., the infrequent target). In Experiment 1, we demonstrate that participants interpret surprised faces more positively when they are presented within a context of happy faces, as compared to a context of angry faces. In Experiments 2 and 3, we used the oddball paradigm to evaluate the effects of clearly valenced facial expressions (i.e., happy and angry) on default valence interpretations of surprised faces. We offer evidence that the default interpretation of surprise is negative, as participants were faster to detect surprised faces when presented within a happy context (Exp. 2). Finally, we kept the valence of the contexts constant (i.e., surprised faces) and showed that participants were faster to detect happy than angry faces (Exp. 3). Together, these experiments demonstrate the utility of the oddball paradigm to explore the default valence interpretation of presented facial expressions, particularly the ambiguously valenced facial expression of surprise.

11.
Event-related brain potentials were measured in 7- and 12-month-old infants to examine the development of processing happy and angry facial expressions. In 7-month-olds a larger negativity to happy faces was observed at frontal, central, temporal and parietal sites (Experiment 1), whereas 12-month-olds showed a larger negativity to angry faces at occipital sites (Experiment 2). These data suggest that processing of these facial expressions undergoes development between 7 and 12 months: while 7-month-olds exhibit heightened sensitivity to happy faces, 12-month-olds resemble adults in their heightened sensitivity to angry faces. In Experiment 3 infants' visual preference was assessed behaviorally, revealing that the differences in ERPs observed at 7 and 12 months do not simply reflect differences in visual preference.

12.
Much research on emotional facial expression employs posed expressions and expressive subjects. To test the generalizability of this research to more spontaneous expressions of both expressive and nonexpressive posers, subjects engaged in happy, sad, angry, and neutral imagery, and voluntarily posed happy, sad, and angry facial expressions while facial muscle activity (brow, cheek, and mouth regions) and autonomic activity (skin resistance and heart period) were recorded. Subjects were classified as expressive or nonexpressive on the basis of the intensity of their posed expressions. The posed and imagery-induced expressions were similar, but not identical. Brow activity present in the imagery-induced sad expressions was weak or absent in the posed ones. Both nonexpressive and expressive subjects demonstrated similar heart rate acceleration during emotional imagery and demonstrated similar posed and imagery-induced happy expressions, but nonexpressive subjects showed little facial activity during both their posed and imagery-induced sad and angry expressions. The implications of these findings are discussed.

13.
This study assessed the speed of recognition of facial emotional expressions (happy and angry) as a function of violent video game play. Color photos of calm facial expressions were morphed into either an angry or a happy facial expression. Participants were asked to make a speeded identification of the emotion (happiness or anger) during the morph. Typically, happy faces are identified faster than angry faces (the happy-face advantage). Results indicated that playing a violent video game led to a reduction in the happy-face advantage. Implications of these findings are discussed with respect to current models of aggressive behavior.

14.
Facial EMG activity was measured from the Corrugator supercilii and the Zygomatic major muscle regions while 48 subjects were exposed to pictures of angry and happy facial expressions, snakes and flowers as well as low and high preference nature scenes. The valency perspective predicted that facial reactions should be related to the intensity of the positive and the negative valency of stimuli. The mimicking behavior approach predicted that facial reactions should only be reflected as a mimicking response to the facial stimuli, whereas the evolutionary-biological perspective predicted that the most clear-cut positive and negative facial reactions should be evoked by facial stimuli and by snakes. In support of the latter perspective, the present results showed that angry faces and snakes evoked the most distinct Corrugator supercilii muscle response, whereas happy faces evoked the largest Zygomatic major muscle response.

15.
The interdependent motives of cooperation and competition are integral to adaptive social functioning. In three experiments, we provide novel evidence that both cooperation and competition goals enhance perceptual acuity for both angry and happy faces. Experiment 1 found that both cooperative and competitive motives improve perceivers' ability to discriminate between genuine and deceptive smiles. Experiment 2 revealed that both cooperative and competitive motives improve perceivers' perceptual sensitivity to subtle differences among happy and angry facial expressions. Finally, Experiment 3 found that the motivated increase in perceptual acuity for happy and angry expressions allows perceivers to overcome the effects of visual noise, relative to unmotivated control participants. Collectively, these results provide novel evidence that the interdependent motives of cooperation and competition can attune visual perception, accentuating the subjectively experienced signal strength of anger and happiness.

16.
We systematically examined the impact of emotional stimuli on time perception in a temporal reproduction paradigm where participants reproduced the duration of a facial emotion stimulus using an oval-shape stimulus or vice versa. Experiment 1 asked participants to reproduce the duration of an angry face (or the oval) presented for 2,000 ms. Experiment 2 included a range of emotional expressions (happy, sad, angry, and neutral faces as well as the oval stimulus) presented for different durations (500, 1,500, and 2,000 ms). We found that participants over-reproduced the durations of happy and sad faces using the oval stimulus. By contrast, there was a trend of under-reproduction when the duration of the oval stimulus was reproduced using the angry face. We suggest that increased attention to a facial emotion produces the relativity of time perception.

17.
The current longitudinal study (N = 107) examined mothers’ facial emotion recognition using reaction time and their infants’ affect-based attention at 5, 7, and 14 months of age using eyetracking. Our results, examining maternal and infant responses to angry, fearful and happy facial expressions, show that only maternal responses to angry facial expressions were robustly and positively linked across time points, indexing a consistent trait-like response to social threat among mothers. However, neither maternal responses to happy or fearful facial expressions nor infant responses to all three facial emotions showed such consistency, pointing to the changeable nature of facial emotion processing, especially among infants. In general, infants’ attention toward negative emotions (i.e., angry and fear) at earlier timepoints was linked to their affect-biased attention for these emotions at 14 months but showed greater dynamic change across time. Moreover, our results provide limited evidence for developmental continuity in processing negative emotions and for the bidirectional interplay of infant affect-biased attention and maternal facial emotion recognition. This pattern of findings suggests that infants’ affect-biased attention to facial expressions of emotion is characterized by dynamic changes.

18.
Hostility is associated with biases in the perception of emotional facial expressions, such that ambiguous or neutral expressions tend to be perceived as threatening or angry. In this study, the effects of hostility and gender on the perception of angry, neutral, and happy faces and on the oscillatory dynamics of cortical responses elicited by these presentations were investigated using time–frequency decomposition by means of wavelet transforms. Feelings of hostility predisposed subjects to perceive happy and neutral faces as less friendly. This effect was more pronounced in women. In hostile subjects, presentation of emotional facial expressions also evoked stronger posterior synchronization in the theta and diminished desynchronization in the alpha band. This may signify a prevalence of emotional responding over cognitive processing. These effects were also more pronounced in females. Hostile females, but not hostile males, additionally showed a widespread synchronization in the alpha band. This synchronization is tentatively explained as a manifestation of inhibitory control which is present in aggressive females, but not in aggressive males. Aggr. Behav. 35:502–513, 2009. © 2009 Wiley‐Liss, Inc.

19.
Using a categorical-perception emotion recognition paradigm, this study examined perceptual bias and perceptual sensitivity to happy-angry and happy-sad ambiguous emotional faces in children high and low in shyness. The results showed that (1) compared with low-shyness children, high-shyness children tended to perceive happy-angry ambiguous faces as angry and happy-sad ambiguous faces as sad; and (2) the slopes at the category boundaries of the happy-angry and happy-sad ambiguous faces did not differ significantly between the two groups. These findings suggest that highly shy children exhibit a hostile attribution bias and stronger empathic responses to sadness, while being insensitive to the categorical transition in happy-angry and happy-sad expressions.



Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) · 京ICP备09084417号