Similar Literature
1.
Infants' categorization of animals and vehicles based on static vs. dynamic attributes of stimuli was investigated in five experiments (N=158) using a categorization habituation-of-looking paradigm. In Experiment 1, 6-month-olds categorized static color images of animals and vehicles, and in Experiment 2, 6-month-olds categorized dynamic point-light displays showing only motions of the same animals and vehicles. In Experiments 3, 4, and 5, 6- and 9-month-olds were tested in an habituation-transfer paradigm: half of the infants at each age were habituated to static images and tested with dynamic point-light displays, and the other half were habituated to dynamic point-light displays and tested with static images. Six-month-olds did not transfer. Only 9-month-olds who were habituated to dynamic displays showed evidence of category transfer to static images. Together the findings show that 6-month-olds categorize animals and vehicles based on static and dynamic information, and 9-month-olds can transfer dynamic category information to static images. Transfer, static vs. dynamic information, and age effects in infant categorization are discussed.

2.
Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.

3.
胡治国  刘宏艳 《心理科学》2015,(5):1087-1094
Accurately recognizing facial expressions is important for successful social interaction, and facial expression recognition is influenced by emotional context. This review first describes how emotional context facilitates facial expression recognition, chiefly through emotion-congruency effects within the visual channel and cross-channel emotional integration effects; it then describes how emotional context can impair recognition, chiefly through emotional conflict effects and semantic interference effects; next it considers the influence of emotional context on the recognition of neutral and ambiguous faces, chiefly through context-induced emotion effects and subliminal affective priming effects; finally, it summarizes the existing research and offers suggestions for future studies.

4.
Young adults recognize other young adult faces more accurately than older adult faces, an effect termed the own‐age bias (OAB). The categorization‐individuation model (CIM) proposes that recognition memory biases like the OAB occur as unfamiliar faces are initially quickly categorized. In‐group faces are seen as socially relevant which motivates the processing of individuating facial features. Outgroup faces are processed more superficially with attention to category‐specific information which hinders subsequent recognition. To examine the roles of categorization and individuation in the context of the OAB, participants completed a face recognition task and a speeded age categorization task including young and older adult faces. In the recognition task, half of the participants were given instructions aimed to encourage individuation of other‐age faces. An OAB emerged that was not influenced by individuation instructions, but the magnitude of the OAB was correlated with performance in the categorization task. The larger the categorization advantage for older adult over young adult faces, the larger the OAB. These results support the premise that social categorization processes can affect the subsequent recognition of own‐ and other‐age faces, but do not provide evidence for the effectiveness of individuation instructions in reducing the OAB.

5.
The processing of several important aspects of a human face was investigated in a single patient (LZ), who had a large infarct of the right hemisphere involving the parietal and temporal lobes with extensions into the frontal region. LZ showed selective problems with recognizing emotional expressions, whereas she was flawless in recognizing gender, familiarity, and identity. She was very poor in recognizing negative facial expressions (fear, disgust, anger, sadness), but scored as well as the controls on the positive facial expression of happiness. However, in two experiments using both static and dynamic face stimuli, we showed that LZ also did not have a proper notion of what a facial expression of happiness looks like, and could not adequately apply this label. We conclude that the proper recognition of both negative and positive facial expressions relies on the right hemisphere, and that the left hemisphere produces a default state resulting in a bias towards evaluating expressions as happy. We discuss the implications of the current findings for the main models that aim to explain hemispheric specializations for processing of positive and negative emotions.

6.
Using a categorical-perception emotion recognition paradigm, this study examined perceptual bias and perceptual sensitivity toward ambiguous happy-angry and happy-sad facial expressions in children high versus low in shyness. The results showed that (1) compared with low-shyness children, high-shyness children tended to perceive ambiguous happy-angry faces as angry and ambiguous happy-sad faces as sad; (2) the two groups did not differ significantly in the slope at the category boundary for either the happy-angry or the happy-sad continuum. The findings suggest that highly shy children show a hostile attribution bias and a stronger empathic response to sadness, but are not more sensitive to the categorical shift in happy-angry or happy-sad expressions.

8.
Four studies were conducted to test the hypothesis that group-related physical features may directly activate related stereotypes, leading to more stereotypic inferences over and above those resulting from categorization. As predicted, targets with more Afrocentric features were judged as more likely to have traits stereotypic of African Americans. This effect was found with judgments of African Americans and of European Americans. Furthermore, the effect was not eliminated when a more sensitive measure of categorization processes (category accessibility) was used or when the judgment context made category distinctions salient. Of additional interest was the finding that category accessibility independently affected judgment, such that targets who could be more quickly categorized as group members were judged more stereotypically.

9.
Recent studies measuring the facial expressions of emotion have focused primarily on the perception of frontal face images. As we frequently encounter expressive faces from different viewing angles, having a mechanism which allows invariant expression perception would be advantageous to our social interactions. Although a couple of studies have indicated comparable expression categorization accuracy across viewpoints, it is unknown how perceived expression intensity and associated gaze behaviour change across viewing angles. Differences could arise because diagnostic cues from local facial features for decoding expressions could vary with viewpoints. Here we manipulated the orientation of faces (frontal, mid-profile, and profile view) displaying six common facial expressions of emotion, and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. In comparison with frontal faces, profile faces slightly reduced identification rates for disgust and sad expressions, but significantly decreased perceived intensity for all tested expressions. Although viewpoint had a quantitative, expression-specific influence on the proportion of fixations directed at local facial features, the qualitative gaze distribution within facial features (e.g., the eyes tended to attract the highest proportion of fixations, followed by the nose and then the mouth region) was independent of viewpoint and expression type. Our results suggest that facial expression processing is viewpoint-invariant and categorical, which could be linked to a viewpoint-invariant holistic gaze strategy for extracting expressive facial cues.

10.
温芳芳  佐斌 《心理科学》2019,(2):395-401
As a basic process and important pathway of social cognition, social categorization plays an important role in predicting stereotypes and intergroup perception, reducing relational conflict in multicultural groups, facilitating reasoning and decision making, and guiding inferences about social relationships. The cues people use for social categorization can be summarized as salient versus ambiguous cues, natural versus social cues, and static versus dynamic cues. Social categorization is shaped by the target being categorized, the situation, and the perceiver, and in turn influences people's cognition, emotion, affect, and behavior. Future research could examine cue biases and underlying mechanisms of social categorization from cross-cultural and developmental perspectives, and test its effects and intervention strategies within the local cultural context.

11.
陈本友  黄希庭 《心理科学》2012,35(4):770-777
Facial expressions were segmented into three parts and presented successively at varying temporal intervals and presentation durations to examine participants' temporal integration of facial expressions and thereby explore the processing and influencing factors of temporal integration. The results showed that (1) the temporal integration of facial expressions was affected by the temporal structure and the stimulus materials; (2) whether separately presented facial expression parts could be temporally integrated depended on the SOA; (3) temporal integration differed across expression types; and (4) temporal integration of facial expressions occurs within a limited visual buffer, with iconic memory and long-term memory closely involved in the integration process.

12.
Past research has shown that children recognize emotions from facial expressions poorly and improve only gradually with age, but the stimuli in such studies have been static faces. Because dynamic faces include more information, it may well be that children more readily recognize emotions from dynamic facial expressions. The current study of children (N = 64, aged 5–10 years old) who freely labeled the emotion conveyed by static and dynamic facial expressions found no advantage of dynamic over static expressions; in fact, reliable differences favored static expressions. An alternative explanation of gradual improvement with age is that children's emotional categories change during development from a small number of broad emotion categories to a larger number of narrower categories—a pattern found here with both static and dynamic expressions.

13.
Sato W  Yoshikawa S 《Cognition》2007,104(1):1-18
Based on previous neuroscientific evidence indicating activation of the mirror neuron system in response to dynamic facial actions, we hypothesized that facial mimicry would occur while subjects viewed dynamic facial expressions. To test this hypothesis, dynamic/static facial expressions of anger/happiness were presented using computer-morphing (Experiment 1) and videos (Experiment 2). The subjects' facial actions were unobtrusively videotaped and blindly coded using the Facial Action Coding System [FACS; Ekman, P., & Friesen, W. V. (1978). Facial action coding system. Palo Alto, CA: Consulting Psychologists Press]. In the dynamic presentations common to both experiments, brow lowering, a prototypical action in angry expressions, occurred more frequently in response to angry expressions than to happy expressions. The pulling of lip corners, a prototypical action in happy expressions, occurred more frequently in response to happy expressions than to angry expressions in dynamic presentations. Additionally, the mean latency of these actions was less than 900 ms after the onset of dynamic changes in facial expression. Naive raters recognized the subjects' facial reactions as emotional expressions, with the valence corresponding to the dynamic facial expressions that the subjects were viewing. These results indicate that dynamic facial expressions elicit spontaneous and rapid facial mimicry, which functions both as a form of intra-individual processing and as inter-individual communication.

14.
Recent studies have demonstrated that context can dramatically influence the recognition of basic facial expressions, yet the nature of this phenomenon is largely unknown. In the present paper we begin to characterize the underlying process of face-context integration. Specifically, we examine whether it is a relatively controlled or automatic process. In Experiment 1 participants were motivated and instructed to avoid using the context while categorizing contextualized facial expressions, or they were led to believe that the context was irrelevant. Nevertheless, they were unable to disregard the context, which exerted a strong effect on their emotion recognition. In Experiment 2, participants categorized contextualized facial expressions while engaged in a concurrent working memory task. Despite the load, the context exerted a strong influence on their recognition of facial expressions. These results suggest that facial expressions and their body contexts are integrated in an unintentional, uncontrollable, and relatively effortless manner.

15.
Relations between typical and atypical exemplars of superordinate categories are low in figurative similarity, i.e., similarity based on appearance or on spatial/temporal context. Operativity, as an emergent competence to overcome figurative cues and establish nonfigurative relations, might be expected to contribute to superordinate categorization. The present study assessed the relative consistency of age-equivalent preoperational and concrete-operational groups of first graders across two categorization tasks employing color drawings of exemplars of superordinate artifact categories. Concrete-operational subjects categorized two exemplars together on a Sample-Match Task if they had previously included both exemplars under the same category on a Category-Membership Task. In addition to membership in the same category, preoperational subjects required that both exemplars be typical before categorizing them together on the Sample-Match Task. The cognitive levels did not differ in their category membership decisions. Results are discussed in terms of both utilization and acquisition of superordinate knowledge.

16.
Past studies found that, for preschoolers, a story specifying a situational cause and behavioural consequence is a better cue to fear and disgust than is the facial expression of those two emotions, but the facial expressions used were static. Two studies (Study 1: N = 68, 36–68 months; Study 2: N = 72, 49–90 months) tested whether this effect could be reversed when the expressions were dynamic and included facial, postural, and vocal cues. Children freely labelled emotions in three conditions: story, still face, and dynamic expression. Story remained a better cue than still face or dynamic expression for fear and disgust and also for the later emerging emotions of embarrassment and pride.

17.
We investigated whether emotional information from facial expression and hand movement quality was integrated when identifying the expression of a compound stimulus showing a static facial expression combined with emotionally expressive dynamic manual actions. The emotions (happiness, neutrality, and anger) expressed by the face and hands were either congruent or incongruent. In Experiment 1, the participants judged whether the stimulus person was happy, neutral, or angry. Judgments were mainly based on the facial expressions, but were affected by manual expressions to some extent. In Experiment 2, the participants were instructed to base their judgment on the facial expression only. An effect of hand movement expressive quality was observed for happy facial expressions. The results conform with the proposal that perception of facial expressions of emotions can be affected by the expressive qualities of hand movements.

18.
Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX discrimination and identification tasks on morphed affective and linguistic facial expression continua. The continua were created by morphing end-point photo exemplars into 11 images, changing linearly from one expression to another in equal steps. For both affective and linguistic expressions, hearing non-signers exhibited better discrimination across category boundaries than within categories for both experiments, thus replicating previous results with affective expressions and demonstrating CP effects for non-canonical facial expressions. Deaf signers, however, showed significant CP effects only for linguistic facial expressions. Subsequent analyses indicated that order of presentation influenced signers’ response time performance for affective facial expressions: viewing linguistic facial expressions first slowed response time for affective facial expressions. We conclude that CP effects for affective facial expressions can be influenced by language experience.
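The equal-step morph continua described in this abstract can be illustrated with a minimal sketch. Assuming two pre-aligned, equally sized endpoint photographs (the file names below are hypothetical placeholders), a linear cross-dissolve yields 11 images that change in equal physical steps; the published work used dedicated morphing software that also warps facial landmarks, so this is only an illustrative approximation, not the authors' procedure.

```python
# Illustrative sketch only: an equal-step continuum between two expression photos
# via linear cross-dissolve. Real morphs also warp facial landmarks.
import numpy as np
from PIL import Image

def morph_continuum(path_a: str, path_b: str, n_steps: int = 11) -> list[Image.Image]:
    a = np.asarray(Image.open(path_a).convert("L"), dtype=float)
    b = np.asarray(Image.open(path_b).convert("L"), dtype=float)
    assert a.shape == b.shape, "endpoint images must be aligned and equally sized"
    frames = []
    for i in range(n_steps):
        alpha = i / (n_steps - 1)             # 0.0 ... 1.0 in equal steps
        blend = (1.0 - alpha) * a + alpha * b  # pixel-wise linear interpolation
        frames.append(Image.fromarray(blend.astype(np.uint8)))
    return frames

# Example with hypothetical files: an 11-image happiness-to-anger continuum
# continuum = morph_continuum("happy_endpoint.png", "angry_endpoint.png")
```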

19.
Presenting stimuli from skewed concentration distributions affects mean responses on category scales. However, if the number of categories on the response scale is increased, the degree of separation between the mean responses obtained for a positively as opposed to a negatively skewed concentration distribution diminishes. The present study investigates the effect of skewed concentration distributions upon ratings on a line scale and compares it to the context effect found for a 7-point category scale. In addition, sequential dependencies between consecutive stimuli and responses are investigated in order to assess their relevance in taste-intensity scaling studies. The context effects are similar for the 7-point category scale and for the line scale. The analyses of sequential effects show that both preceding responses and preceding stimuli affect current responses. However, since these two factors work in opposite directions, only a small contrast effect from the previous stimulus is significant in an overall analysis. The present study shows that even though the overall sequential effects between consecutive stimuli and responses are small, the effect of experimental context may be considerable. Since subjective context is established at the beginning of a session and sequential dependencies operate throughout the whole session, it is argued that contextual and sequential effects are only indirectly related.

20.
N L Etcoff  J J Magee 《Cognition》1992,44(3):227-240
People universally recognize facial expressions of happiness, sadness, fear, anger, disgust, and perhaps surprise, suggesting a perceptual mechanism tuned to the facial configuration displaying each emotion. Sets of drawings were generated by computer, each consisting of a series of faces differing by constant physical amounts, running from one emotional expression to another (or from one emotional expression to a neutral face). Subjects discriminated pairs of faces, then, in a separate task, categorized the emotion displayed by each. Faces within a category were discriminated more poorly than faces in different categories that differed by an equal physical amount. Thus emotional expressions, like colors and speech sounds, are perceived categorically, not as a direct reflection of their continuous physical properties.
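The within- versus between-category comparison underlying this result can be stated compactly. The sketch below uses made-up illustrative accuracies, not data from Etcoff and Magee: it contrasts discrimination accuracy for the adjacent pair of continuum steps that straddles the identification boundary with the mean accuracy for within-category pairs, where a positive difference indicates categorical perception.

```python
# Illustrative sketch with made-up numbers (not data from the study): compare
# discrimination accuracy for the adjacent pair straddling the category boundary
# with accuracy for pairs falling within a category.
def cp_effect(pair_accuracy, boundary_pair):
    """pair_accuracy[i]: accuracy for discriminating continuum steps i and i+1.
    boundary_pair: index of the pair that straddles the identification boundary."""
    between = pair_accuracy[boundary_pair]
    within = [acc for i, acc in enumerate(pair_accuracy) if i != boundary_pair]
    return between - sum(within) / len(within)   # positive value indicates a CP effect

accuracies = [0.62, 0.65, 0.63, 0.88, 0.66, 0.61]  # accuracy peaks at the boundary pair
print(round(cp_effect(accuracies, boundary_pair=3), 3))  # prints 0.246
```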
