Similar documents
 20 similar documents found (search time: 46 ms)
1.
The metaphoric expression ‘bright smile’ may reflect an actual judgment of facial lightness under varying emotional expressions. The present research examined whether people in fact judge smiling faces as perceptually brighter than frowning faces. Four studies demonstrated that participants judged smiling faces as brighter than frowning faces, both in a binary choice task and in an absolute judgment task. The results suggest that emotional expressions (i.e., smiles and frowns) can bias judgments of facial brightness in ways consistent with the metaphor. Among other implications, such results suggest that stereotypes about darker-skinned individuals may be attenuated by smiles.

2.
Brain imaging studies of explicit and implicit facial-attractiveness processing show that attractive faces elicit greater activation than unattractive faces in reward-related regions such as the orbitofrontal cortex, nucleus accumbens, and amygdala. EEG studies have further identified an early negativity and a late positive component associated with facial aesthetic processing. Brain activity related to facial aesthetic processing is also modulated by individual factors such as sex and menstrual-cycle phase. Future research should compare the neural mechanisms of processing attractive faces with those of other reward stimuli, examine the distinct stages and time course of facial aesthetic processing, and develop a plausible neural model of facial aesthetic processing within the framework of face perception.

3.
Smiling is the most common and most frequent human facial expression. Humans have evolved the ability to fake smiles, along with a partial ability to detect such fakery. Dynamic information plays an important role in both the expression and the recognition of facial expressions. On the one hand, the dynamic characteristics of smile production may provide key information for distinguishing genuine from posed smiles. We therefore plan to use recently developed computer-vision feature-extraction techniques to systematically quantify the dynamic features of genuine and posed smiles (duration, direction, speed, smoothness, movement symmetry, synchrony across facial regions, head-movement patterns, etc.), and to examine how smiles differ or remain consistent across different types of posing and different contexts, in order to better understand the characteristics of human smile production. On the other hand, by exploring the relationship between informative dynamic features and recognition accuracy, we aim to test the perception-attention hypothesis and to characterize how genuine and posed smiles are recognized and the mechanisms underlying that recognition. Comparing the production and recognition characteristics of dynamic genuine and posed smiles will further our understanding of the relationship between the encoding and decoding of human expression signals.

4.
The objectives of this study were to propose a method of presenting dynamic facial expressions to experimental subjects, in order to investigate human perception of an avatar's facial expressions at different levels of emotional intensity. The investigation concerned how perception varies according to the strength of the facial expression, as well as according to the avatar's gender. To accomplish these goals, we generated a male and a female virtual avatar with five levels of intensity of happiness and anger using a morphing technique. We then recruited 16 normal healthy subjects and measured each subject's emotional reaction by scoring affective arousal and valence after showing them the avatar's face. Through this study, we were able to investigate human perceptual characteristics evoked by male and female avatars' graduated facial expressions of happiness and anger. In addition, we were able to identify that a virtual avatar's facial expression could affect human emotion in different ways according to the avatar's gender and the intensity of its facial expressions. However, we could also see that virtual faces have some limitations because they are not real: subjects recognized the expressions well, but were not influenced to the same extent. Although a virtual avatar has some limitations in conveying emotion through facial expressions, this study is significant in that it shows a new potential to use or manipulate emotional intensity by controlling a virtual avatar's facial expression linearly using a morphing technique. Therefore, it is predicted that this technique may be used for assessing emotional characteristics of humans, and may be of particular benefit for work with people with emotional disorders through the presentation of dynamic expressions of various emotional intensities.

5.
6.
A substantial body of research has established that even when we are not consciously aware of the faces of others, we are nevertheless sensitive to, and impacted by, their facial expressions. In this paper, we consider this body of research from a new perspective by examining the functions of unconscious perception revealed by these studies. A consideration of the literature from this perspective highlights that existing research methods are limited when it comes to revealing possible functions of unconscious perception. The critical shortcoming is that in all of these methods, the perceived facial expression remains outside of awareness. This is a problem because there are good reasons to believe that one important function of unconsciously perceived negative faces is to attract attention so that they are consciously perceived; such conscious perception, however, is never allowed with existing methodologies. We discuss recent studies of emotional face perception under conditions of visual search that address this issue directly. Further, we suggest that methodologies that do not examine cognitive processes as they occur in more natural settings may result in fundamental misunderstandings of human cognition.

7.
Most theories of social influence do not consider adult development. Theoretical and empirical work in life span developmental psychology, however, suggests that age may reduce susceptibility to social influence. The present study examined age differences in social conformity for 2 classes of stimuli: judgments of geometric shapes and emotional facial expressions. As predicted, older people, compared with their younger counterparts, displayed lower rates of social conformity, and this age difference was most evident when judging emotional facial expressions.

8.
Laughter is an auditory stimulus that powerfully conveys positive emotion. We investigated how laughter influenced the visual perception of facial expressions. We presented a sound clip of laughter simultaneously with a happy, a neutral, or a sad schematic face. The emotional face was briefly presented either alone or among a crowd of neutral faces. We used a matching method to determine how laughter influenced the perceived intensity of the happy, neutral, and sad expressions. For a single face, laughter increased the perceived intensity of a happy expression. Surprisingly, for a crowd of faces, laughter produced an opposite effect, increasing the perceived intensity of a sad expression in a crowd. A follow-up experiment revealed that this contrast effect may have occurred because laughter made the neutral distractor faces appear slightly happy, thereby making the deviant sad expression stand out in contrast. A control experiment ruled out semantic mediation of the laughter effects. Our demonstration of the strong context dependence of laughter effects on facial expression perception encourages a reexamination of the previously demonstrated effects of prosody, speech content, and mood on face perception, as they may be similarly context dependent.

9.
Contribution of color to face recognition
Yip AW, Sinha P. Perception, 2002, 31(8): 995-1003
One of the key challenges in face perception lies in determining how different facial attributes contribute to judgments of identity. In this study, we focus on the role of color cues. Although color appears to be a salient attribute of faces, past research has suggested that it confers little recognition advantage for identifying people. Here we report experimental results suggesting that color cues do play a role in face recognition and their contribution becomes evident when shape cues are degraded. Under such conditions, recognition performance with color images is significantly better than that with gray-scale images. Our experimental results also indicate that the contribution of color may lie not so much in providing diagnostic cues to identity as in aiding low-level image-analysis processes such as segmentation.

10.
Van Der Zant T, Reid J, Mondloch CJ, Nelson NL. Motivation and Emotion, 2021, 45(5): 641-648

Perceptions of others’ traits (e.g., trustworthiness or dominance) are influenced by the emotion displayed on their face. For instance, the same individual appears more trustworthy when they express happiness than when they express anger. This overextension of emotional expressions has been shown with facial expression but whether this phenomenon also occurs when viewing postural expressions was unknown. We sought to examine how expressive behaviour of the body would influence judgements of traits and how sensitivity to this cue develops. In the context of a storybook, adults (N = 35) and children (5 to 8 years old; N = 60) selected one of two partners to help face a challenge. The challenges required either a trustworthy or dominant partner. Participants chose between a partner with an emotional (happy/angry) face and neutral body or one with a neutral face and emotional body. As predicted, happy facial expressions were preferred over neutral ones when selecting a trustworthy partner and angry postural expressions were preferred over neutral ones when selecting a dominant partner. Children’s performance was not adult-like on most tasks. The results demonstrate that emotional postural expressions can also influence judgments of others’ traits, but that postural influence on trait judgments develops throughout childhood.


11.
In a sample of 325 college students, we examined how context influences judgments of facial expressions of emotion, using a newly developed facial affect recognition task in which emotional faces are superimposed upon emotional and neutral contexts. This research used a larger sample size than previous studies, included more emotions, varied the intensity level of the expressed emotion to avoid potential ceiling effects from very easy recognition, did not explicitly direct attention to the context, and aimed to understand how recognition is influenced by non-facial information, both situationally relevant and situationally irrelevant. Both accuracy and reaction time (RT) varied as a function of context. For all facial expressions of emotion other than happiness, accuracy increased when the emotion of the face and context matched, and decreased when they mismatched. For all emotions, participants responded faster when the emotion of the face and image matched and slower when they mismatched. Results suggest that the judgment of the facial expression is itself influenced by the contextual information, rather than the two being judged independently and then combined. Additionally, the results have implications for developing models of facial affect recognition and indicate that there are factors other than the face that can influence facial affect recognition judgments.

12.
We studied gaze perception in three infant chimpanzees (Pan troglodytes), aged 10-32 weeks, using a two-choice preferential-looking paradigm. The infants were presented with two photographs of a human face: (a) with the eyes open or closed, and (b) with a direct or an averted gaze. We found that the chimpanzees preferred looking at the direct-gaze face. However, in the context of scrambled faces, the infants showed no difference in gaze discrimination between direct and averted gazes. These findings suggest that gaze perception by chimpanzees may be influenced by the surrounding facial context. The relationship between gaze perception, face processing, and the adaptive significance of gaze perception are discussed from an evolutionary perspective.

13.
Humans have developed a specific capacity to rapidly perceive and anticipate other people’s facial expressions so as to get an immediate impression of their emotional state of mind. We carried out two experiments to examine the perceptual and memory dynamics of facial expressions of pain. In the first experiment, we investigated how people estimate other people’s levels of pain based on the perception of various dynamic facial expressions; these differ both in terms of the amount and intensity of activated action units. A second experiment used a representational momentum (RM) paradigm to study the emotional anticipation (memory bias) elicited by the same facial expressions of pain studied in Experiment 1. Our results highlighted the relationship between the level of perceived pain (in Experiment 1) and the direction and magnitude of memory bias (in Experiment 2): When perceived pain increases, the memory bias tends to be reduced (if positive) and ultimately becomes negative. Dynamic facial expressions of pain may reenact an “immediate perceptual history” in the perceiver before leading to an emotional anticipation of the agent’s upcoming state. Thus, a subtle facial expression of pain (i.e., a low contraction around the eyes) that leads to a significant positive anticipation can be considered an adaptive process—one through which we can swiftly and involuntarily detect other people’s pain.

14.
Positive and Negative: Infant Facial Expressions and Emotions
One path to understanding emotional processes and their development is the investigation of early facial expressions. Converging evidence suggests that although all infant smiles index positive emotion, some smiles are more positive than others. The evidence stems both from the situations in which infants produce different facial expressions and from naive observers' ratings of the emotional intensity of the expressions. The observers' ratings also suggest that similar facial actions—such as cheek raising—lead smiles to be perceived as more positive and lead negative expressions (cry-faces) to be perceived as more negative. One explanation for this parsimony is that certain facial actions are associated with the intensification of both positive and negative emotions.

15.
A smile is visually highly salient and grabs attention automatically. We investigated how extrafoveally seen smiles influence the viewers' perception of non-happy eyes in a face. A smiling mouth appeared in composite faces with incongruent non-happy (fearful, neutral, etc.) eyes, thus producing blended expressions, or it appeared in intact faces with genuine expressions. Attention to the eye region was spatially cued while foveal vision of the mouth was blocked by gaze-contingent masking. Participants judged whether the eyes were happy or not. Results indicated that the smile biased the evaluation of the eye expression: The same non-happy eyes were more likely to be judged as happy and categorized more slowly as not happy in a face with a smiling mouth than in a face with a non-smiling mouth or with no mouth. This bias occurred when the mouth and the eyes appeared simultaneously and aligned, but also to some extent when they were misaligned and when the mouth appeared after the eyes. We conclude that the highly salient smile projects to other facial regions, thus influencing the perception of the eye expression. Projection serves spatial and temporal integration of face parts and changes.

16.
Observers make a range of social evaluations based on facial appearance, including judgments of trustworthiness, warmth, competence, and other aspects of personality. What visual information do people use to make these judgments? While links have been made between perceived social characteristics and other high-level properties of facial appearance (e.g., attractiveness, masculinity), there has been comparatively little effort to link social evaluations to low-level visual features, like spatial frequency and orientation sub-bands, known to be critically important for face processing. We explored the extent to which different social evaluations depended critically on horizontal orientation energy vs. vertical orientation energy, as is the case for face identification and emotion recognition. We found that while trustworthiness judgments exhibited this bias for horizontal orientations, competence and dominance did not, suggesting that social evaluations may depend on a multi-channel representation of facial appearance at early stages of visual processing.

17.
Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain–behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = –.51) and memory (r = –.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

18.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression.

19.
Self-other merging can arise not only between acquainted people but also between strangers. To date, the factors determining self-other merging between strangers remain to be elucidated. We investigated whether a stranger's facial appearance (i.e., gaze direction) modulates such initial processes of self-other merging. In two experiments, participants viewed strangers’ faces whose gaze was either directed at or averted from them. The extent of self-other merging was measured in terms of perceived facial resemblance, the Inclusion of the Other in the Self (IOS) scale, and correlations of personality judgments. We found that direct gaze blurred the self-other boundaries at both the facial and the conceptual level: participants felt that a stranger who gazed directly at them was closer and more similar to themselves in both face and personality.

20.
This experiment examines how emotion is perceived by using facial and vocal cues of a speaker. Three levels of facial affect were presented using a computer-generated face. Three levels of vocal affect were obtained by recording the voice of a male amateur actor who spoke a semantically neutral word in different simulated emotional states. These two independent variables were presented to subjects in all possible permutations—visual cues alone, vocal cues alone, and visual and vocal cues together—which gave a total set of 15 stimuli. The subjects were asked to judge the emotion of the stimuli in a two-alternative forced choice task (either HAPPY or ANGRY). The results indicate that subjects evaluate and integrate information from both modalities to perceive emotion. The influence of one modality was greater to the extent that the other was ambiguous (neutral). The fuzzy logical model of perception (FLMP) fit the judgments significantly better than an additive model, which weakens theories based on an additive combination of modalities, categorical perception, and influence from only a single modality.

