Similar Articles
Found 20 similar articles (search time: 15 ms)
1.
Facial expressions are crucial to human social communication, but the extent to which they are innate and universal versus learned and culture dependent is a subject of debate. Two studies explored the effect of culture and learning on facial expression understanding. In Experiment 1, Japanese and U.S. participants interpreted facial expressions of emotion. Each group was better than the other at classifying facial expressions posed by members of the same culture. In Experiment 2, this reciprocal in-group advantage was reproduced by a neurocomputational model trained in either a Japanese cultural context or an American cultural context. The model demonstrates how each of us, interacting with others in a particular cultural context, learns to recognize a culture-specific facial expression dialect.

2.
Facial emotions are important for human communication. Unfortunately, traditional facial emotion recognition tasks do not inform about how respondents might behave towards others expressing certain emotions. Approach-avoidance tasks do measure behaviour, but only on one dimension. In this study, 81 participants completed a novel Facial Emotion Response Task. Images displaying individuals with emotional expressions were presented in random order. Participants simultaneously indicated how communal (quarrelsome vs. agreeable) and how agentic (dominant vs. submissive) they would be in response to each expression. We found that participants responded differently to happy, angry, fearful, and sad expressions in terms of both dimensions of behaviour. Higher levels of negative affect were associated with less agreeable responses specifically towards happy and sad expressions. The Facial Emotion Response Task might complement existing facial emotion recognition and approach-avoidance tasks.

3.
韦程耀 & 赵冬梅, 《心理科学进展》 (Advances in Psychological Science), 2012, 20(10): 1614-1622
Recent cross-cultural research on facial expressions has produced growing evidence of both cross-cultural consistency and cross-cultural differences. The expression and recognition of spontaneous facial expressions, the in-group advantage effect, and the upper-lower asymmetry of facial expression information have become focal topics in this field. Dialect theory, the Chinese folk model, and the EMPATH model offer theoretical accounts of the findings of cross-cultural facial expression research from three different perspectives. Display rules and decoding rules, as well as language effects, are important factors influencing the cross-cultural expression and recognition of facial expressions. Future cross-cultural research on the expression and recognition of facial expressions should pay closer attention to two aspects: the feature information of facial expressions and the factors that influence them.

4.
The widespread supposition that aspects of facial communication are uncontrollable and can betray a deceiver's true emotion has received little empirical attention. We examined the presence of inconsistent emotional expressions and "microexpressions" (1/25-1/5 of a second) in genuine and deceptive facial expressions. Participants viewed disgusting, sad, frightening, happy, and neutral images, responding to each with a genuine or deceptive (simulated, neutralized, or masked) expression. Each 1/30-s frame (104,550 frames in 697 expressions) was analyzed for the presence and duration of universal expressions, microexpressions, and blink rate. Relative to genuine emotions, masked emotions were associated with more inconsistent expressions and an elevated blink rate; neutralized emotions showed a decreased blink rate. Negative emotions were more difficult to falsify than happiness. Although untrained observers performed only slightly above chance at detecting deception, inconsistent emotional leakage occurred in 100% of participants at least once and lasted longer than the current definition of a microexpression suggests. Microexpressions were exhibited by 21.95% of participants in 2% of all expressions, and in the upper or lower face only.
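The frame-by-frame timing analysis this abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' actual coding pipeline: only the 30 fps frame rate and the conventional 1/25-1/5 s microexpression window are taken from the abstract; the label scheme and function names are assumptions.

```python
# Illustrative sketch: collapse per-frame expression labels into runs and
# flag runs whose duration falls inside the microexpression window.
FRAME_RATE = 30  # frames per second, as in the study's 1/30-s frames
MICRO_MIN, MICRO_MAX = 1 / 25, 1 / 5  # conventional microexpression bounds (s)

def expression_runs(frame_labels):
    """Collapse per-frame labels into (label, duration_in_seconds) runs."""
    runs = []
    for label in frame_labels:
        if runs and runs[-1][0] == label:
            runs[-1][1] += 1
        else:
            runs.append([label, 1])
    return [(label, count / FRAME_RATE) for label, count in runs]

def microexpressions(frame_labels):
    """Return non-neutral runs whose duration is in the microexpression window."""
    return [(label, dur) for label, dur in expression_runs(frame_labels)
            if label != "neutral" and MICRO_MIN <= dur <= MICRO_MAX]
```

For example, a 4-frame flash of disgust (4/30 s, about 0.13 s) falls inside the window and is flagged, whereas a 12-frame display (0.4 s) does not; the abstract's finding that leakage often outlasts the standard window corresponds to runs landing above `MICRO_MAX`.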

5.
Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.

6.
Facial asymmetry in posed and spontaneous expressions of emotion
Patterns of facial asymmetry (i.e., extent of movement) as a function of elicitation condition, emotional valence, and sex of subjects are examined. Thirty-seven right-handed adult males and females were videotaped making positive and negative expressions of emotion under posed (verbal, visual) and spontaneous conditions. There were no differences in facial asymmetry as a function of condition. Overall, expressions were significantly left-sided, a finding implicating the right hemisphere. When sex and valence were considered, negative expressions were left-sided for all subjects, while positive expressions were left-sided for males only. Further, positive expressions were significantly less lateralized than negative ones for females. Measures of hemiface mobility and ocular dominance did not mediate these patterns of facial lateralization.

7.
The effects of Parkinson's disease (PD) on spontaneous and posed facial activity and on the control of facial muscles were assessed by comparing 22 PD patients with 22 controls. Facial activity was analysed using the Facial Action Coding System (FACS; Ekman & Friesen, 1978). As predicted, PD patients showed reduced levels of spontaneous and posed facial expression in reaction to unpleasant odours compared to controls. PD patients were less successful than controls in masking or intensifying negative facial expressions. PD patients were also less able than controls to imitate specific facial muscle movements, but did not differ in the ability to pose emotional facial expressions. These results suggest that not only is spontaneous facial activity disturbed in PD, but also to some degree the ability to pose facial expressions, to mask facial expressions with other expressions, and to deliberately move specific muscles in the face.

8.
The objectives of this study were to propose a method of presenting dynamic facial expressions to experimental subjects, in order to investigate human perception of an avatar's facial expressions at different levels of emotional intensity. The investigation concerned how perception varies according to the strength of the facial expression, as well as according to the avatar's gender. To accomplish these goals, we generated a male and a female virtual avatar with five levels of intensity of happiness and anger using a morphing technique. We then recruited 16 healthy subjects and measured each subject's emotional reaction by scoring affective arousal and valence after showing them the avatar's face. Through this study, we were able to investigate the human perceptual characteristics evoked by male and female avatars' graduated facial expressions of happiness and anger, and to show that a virtual avatar's facial expression can affect human emotion in different ways depending on the avatar's gender and the intensity of its facial expression. However, virtual faces have limitations because they are not real: subjects recognized the expressions well but were not influenced by them to the same extent. Although a virtual avatar has some limitations in conveying emotion through facial expressions, this study is significant in showing the potential to manipulate emotional intensity by controlling a virtual avatar's facial expression linearly using a morphing technique. This technique may therefore be useful for assessing the emotional characteristics of humans, and may be of particular benefit in work with people with emotional disorders through the presentation of dynamic expressions of various emotional intensities.
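The graded-intensity morphing described above amounts to linear interpolation between a neutral face and a full-intensity expression. A minimal sketch, assuming faces are represented as flat feature vectors (landmark coordinates or texture values); the representation and function names are illustrative, not the study's implementation:

```python
# Illustrative sketch of graded expression intensity via linear morphing.
def morph(neutral, full, alpha):
    """Blend two equal-length feature vectors: alpha=0 -> neutral, 1 -> full."""
    if len(neutral) != len(full):
        raise ValueError("feature vectors must have the same length")
    return [(1 - alpha) * n + alpha * f for n, f in zip(neutral, full)]

def intensity_levels(neutral, full, levels=5):
    """Generate `levels` evenly spaced intensities, from weakest to full."""
    return [morph(neutral, full, (i + 1) / levels) for i in range(levels)]
```

With `levels=5` the blend weights are 0.2, 0.4, 0.6, 0.8, and 1.0, matching the five graded steps of happiness or anger the study presented; because the control parameter is a single scalar, intensity can be varied linearly as the abstract notes.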

9.
It is common scientific knowledge that much of what we communicate in conversation is expressed not through our words' meaning alone, but also through our gestures, postures, and body movements. This non-verbal mode is possibly rooted firmly in our human evolutionary heritage, and some scientists argue that it serves as a fundamental tool for assessing and expressing our inner qualities. Studies of nonverbal communication have established that a universal, culture-free, non-verbal sign system exists that is available to all individuals for negotiating social encounters. Thus, what matters is not only the kind of gestures and expressions humans use in social communication, but also the way these movements are performed, as this seems to convey key information about an individual's quality. Dance, for example, is a special form of movement observed in human courtship displays. Recent research suggests that people are sensitive to variation in dance movements, and that dance performance provides information about an individual's mate quality in terms of health and strength. This article reviews the role of body movement in human non-verbal communication and highlights its significance in human mate preferences, in order to promote future work in this research area within the evolutionary psychology framework.

10.
Facial expression and emotional stimuli were varied orthogonally in a 3 x 4 factorial design in order to test whether facial expression is necessary or sufficient to influence emotional experience. Subjects watched a film eliciting fear, sadness, or no emotion, while holding their facial muscles in the position characteristic of fear or sadness, or in an effortful but nonemotional grimace; those in a fourth group received no facial instructions. The subjects believed that the study concerned subliminal perception and that the facial positions were necessary to prevent physiological recording artifacts. The films had powerful effects on reported emotions; the facial expressions had none. Correlations between facial expression and reported emotion were zero. Sad and fearful subjects showed distinctive patterns of physiological arousal. Facial expression also tended to affect physiological responses in a manner consistent with an effort hypothesis.

11.
The relationship between knowledge of American Sign Language (ASL) and the ability to encode facial expressions of emotion was explored. Participants were 55 college students, half of whom were intermediate-level students of ASL and half of whom had no experience with a signed language. In front of a video camera, participants posed the affective facial expressions of happiness, sadness, fear, surprise, anger, and disgust. These facial expressions were randomized onto stimulus tapes that were then shown to 60 untrained judges who tried to identify the expressed emotions. Results indicated that hearing subjects knowledgeable in ASL were generally more adept than were hearing nonsigners at conveying emotions through facial expression. Results have implications for better understanding the nature of nonverbal communication in hearing and deaf individuals.

12.
Since the introduction of empirical methods for studying facial expression, the interpretation of infant facial expressions has generated much debate. The premise of this article is that action tendencies of approach and withdrawal constitute a core organizational feature of emotion in humans, promoting coherence of behavior, facial signaling, and physiological responses. The approach/withdrawal framework can provide a taxonomy of contexts and the neurobehavioral framework for the systematic, empirical study of individual differences in expression, physiology, and behavior within individuals as well as across contexts over time. By adopting this framework in developmental work on basic emotion processes, it may be possible to better understand the behavioral principles governing facial displays, and how individual differences in them are related to physiology and behavioral function in context.

13.
Facial expressions convey not only emotions but also communicative information. Therefore, facial expressions should be analysed to understand communication. The objective of this study is to develop an automatic facial expression analysis system for extracting nonverbal communicative information. This study focuses on specific communicative information: emotions expressed through facial movements and the direction of the expressions. We propose a multi-tasking deep convolutional network (DCN) to classify facial expressions, detect the facial regions, and estimate face angles. We reformulate facial region detection and face angle estimation as regression problems and add task-specific output layers in the DCN's architecture. Experimental results show that the proposed method performs all tasks accurately. In this study, we show the feasibility of the multi-tasking DCN for extracting nonverbal communicative information from a human face.
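A multi-task network of this kind is typically trained by summing a classification loss with the reformulated regression losses, one term per task-specific head. The sketch below shows such a combined objective under stated assumptions: the loss weights, head names, and exact formulation are illustrative, not the paper's actual architecture or training setup.

```python
# Hypothetical sketch of a multi-task objective: cross-entropy for expression
# classification plus squared-error terms for the tasks reformulated as
# regression (facial-region box and face angle).
import math

def cross_entropy(probs, true_class):
    """Negative log-likelihood of the true class under predicted probabilities."""
    return -math.log(probs[true_class])

def mse(pred, target):
    """Mean squared error between two equal-length vectors."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def multitask_loss(expr_probs, expr_label,
                   box_pred, box_target,
                   angle_pred, angle_target,
                   w_box=1.0, w_angle=1.0):
    """Weighted sum of the three task losses: one scalar to backpropagate."""
    return (cross_entropy(expr_probs, expr_label)
            + w_box * mse(box_pred, box_target)
            + w_angle * mse(angle_pred, angle_target))
```

Because all heads share one backbone, minimizing this single scalar jointly trains classification, detection, and angle estimation; the weights `w_box` and `w_angle` (assumed here, not reported values) balance the tasks' gradient contributions.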

14.
Few studies have investigated how physical and social facial cues are integrated in the formation of face preferences. Here we show that expression differentially qualifies the strength of attractiveness preferences for faces with direct and averted gaze. For judgments of faces with direct gaze, attractiveness preferences were stronger for smiling faces than for faces with neutral expressions. By contrast, for judgments of faces with averted gaze, attractiveness preferences were stronger for faces with neutral expressions than for smiling faces. Because expressions can differ in meaning depending on whether they are directed toward or away from oneself, it is only by integrating gaze direction, facial expression, and physical attractiveness that one can unambiguously identify the most attractive individuals who are likely to reciprocate one's own social interest.

15.
To reveal the characteristics and electrophysiological mechanisms of the recognition of threatening (angry and fearful) facial expressions by individuals high in trait aggression, this study used the Buss-Perry Aggression Questionnaire to select 26 high-trait-aggression and 27 low-trait-aggression participants, and used a face recognition paradigm to examine ERP differences between the two groups when identifying threatening facial expressions. The results showed that, for both angry and fearful expressions, the high-trait-aggression group had significantly shorter N170 latencies and significantly larger P200 amplitudes than the low-trait-aggression group. This indicates that individuals high in trait aggression are highly sensitive to angry and fearful threatening facial expressions, and that this sensitivity is located in the early and middle stages of facial expression recognition rather than the late stage: at the early, pre-attentive stage, high-trait-aggression individuals preferentially attend to angry and fearful threatening expressions, and at the middle, attentive stage, they can readily identify them.

16.
It is often assumed that intimacy and familiarity lead to better and more effective emotional communication between two individuals. However, research has failed to unequivocally support this claim. The present study proposes that close dyads show a greater advantage in decoding subdued facial cues than in decoding highly intense expressions. A total of 43 close-friend dyads and 49 casual-acquaintance dyads (all women) were compared on their recognition of their partner's and a stranger's subdued facial expressions. Dyadic analyses indicate that close friends were more accurate, and also improved more rapidly, than casual acquaintances in decoding one another's subdued expressions of sadness, anger, and happiness, especially the two negative emotions, but not in detecting the stranger's subdued expressions. The results strongly suggest that intimacy fosters more accurate decoding of subdued facial expressions.

17.
Deception has been reported to be influenced by task-relevant emotional information from an external stimulus. However, it remains unclear how task-irrelevant emotional information influences deception. In the present study, facial expressions of different valence and emotional intensity were presented to participants, who were asked to make either truthful or deceptive gender judgments according to the preceding cues. We observed an influence of facial expression intensity on individuals' cognitive cost of deceiving (the mean difference between truthful and deceptive response times): a larger cost was observed for high-intensity faces than for low-intensity faces. These results provide insight into how the automatic attraction of attention evoked by task-irrelevant emotional information in facial expressions influences individuals' cognitive cost of deceiving.

18.
Ribeiro, L. A. & Fearon, P. (2010). Theory of mind and attentional bias to facial emotional expressions: A preliminary study. Scandinavian Journal of Psychology. Theory of mind ability has been associated with performance in interpersonal interactions and has been found to influence aspects such as emotion recognition, social competence, and social anxiety. Being able to attribute mental states to others requires attention to subtle communication cues such as facial emotional expressions. Decoding and interpreting emotions expressed by the face, especially those with negative valence, are essential skills for successful social interaction. The current study explored the association between theory of mind skills and attentional bias to facial emotional expressions. Consistent with the study hypothesis, individuals with poor theory of mind skills showed preferential attention to negative faces over both non-negative faces and neutral objects. Tentative explanations for the findings are offered, emphasizing the potential adaptive role of vigilance for threat as a way of allocating a limited capacity to interpret others' mental states to obtain as much information as possible about potential danger in the social environment.

19.
Unconscious facial reactions to emotional facial expressions
Studies reveal that when people are exposed to emotional facial expressions, they spontaneously react with distinct facial electromyographic (EMG) reactions in emotion-relevant facial muscles. These reactions reflect, in part, a tendency to mimic the facial stimuli. We investigated whether corresponding facial reactions can be elicited when people are unconsciously exposed to happy and angry facial expressions. Through use of the backward-masking technique, the subjects were prevented from consciously perceiving 30-ms exposures of happy, neutral, and angry target faces, which immediately were followed and masked by neutral faces. Despite the fact that exposure to happy and angry faces was unconscious, the subjects reacted with distinct facial muscle reactions that corresponded to the happy and angry stimulus faces. Our results show that both positive and negative emotional reactions can be unconsciously evoked, and particularly that important aspects of emotional face-to-face communication can occur on an unconscious level.

20.
This study investigated whether observers' facial reactions to the emotional facial expressions of others represent an affective or a cognitive response to these emotional expressions. Three hypotheses were contrasted: (1) facial reactions to emotional facial expressions are due to mimicry as part of an affective empathic reaction; (2) facial reactions to emotional facial expressions are a reflection of shared affect due to emotion induction; and (3) facial reactions to emotional facial expressions are determined by cognitive load depending on task difficulty. Two experiments were conducted varying type of task, presentation of stimuli, and task difficulty. The results show that depending on the nature of the rating task, facial reactions to facial expressions may be either affective or cognitive. Specifically, evidence for facial mimicry was only found when individuals made judgements regarding the valence of an emotional facial expression. Other types of judgements regarding facial expressions did not seem to elicit mimicry but may lead to facial responses related to cognitive load.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.). 京ICP备09084417号