Similar Documents
20 similar documents found (search time: 31 ms)
1.
The ability of high-functioning individuals with autism to perceive facial expressions categorically was studied using eight facial expression continua created via morphing software. Participants completed a delayed matching task and an identification task. As with undergraduate male participants (N = 12), performance on the identification task by participants with autism (N = 15) was predicted by performance on the delayed matching task for the angry-afraid, happy-sad, and happy-surprised continua. This result indicates a clear category boundary and suggests that individuals with autism do perceive at least some facial expressions categorically. Because this result is inconsistent with findings from other studies of categorical perception in individuals with autism, possible explanations for the discrepancy are discussed.

2.
Categorical perception of facial expressions was studied in high-functioning adolescents with autism, using three continua of facial expressions obtained by morphing. In contrast to normal adults, performance on the identification task by autistic subjects did not predict performance on the discrimination task, an indication that autistic individuals do not perceive facial expressions categorically. Autistic subjects with low social intelligence were more impaired than subjects with higher social IQ scores at recognizing expressions in unmanipulated photographs. It is suggested that autistic subjects with higher social intelligence may use compensatory strategies acquired in social training programs. This may camouflage the deficits of this subgroup in the perception of facial expressions.

3.
Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers’ discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum “felt the same” or “felt different.” In the identification task, images were presented individually and participants were asked to label the emotion displayed on the face (e.g., “Does she look happy or sad?”). Results suggest that 3.5-year-olds have the same category boundary as adults. They were more likely to report that the image pairs felt “different” at the image pair that crossed the category boundary. These results suggest that 3.5-year-olds perceive happy and sad emotional facial expressions categorically as adults do. Categorizing emotional expressions is advantageous for children if it allows them to use social information faster and more efficiently.

4.
Does our perception of others' emotional signals depend on the language we speak or is our perception the same regardless of language and culture? It is well established that human emotional facial expressions are perceived categorically by viewers, but whether this is driven by perceptual or linguistic mechanisms is debated. We report an investigation into the perception of emotional facial expressions, comparing German speakers to native speakers of Yucatec Maya, a language with no lexical labels that distinguish disgust from anger. In a free naming task, speakers of German, but not Yucatec Maya, made lexical distinctions between disgust and anger. However, in a delayed match-to-sample task, both groups perceived emotional facial expressions of these and other emotions categorically. The magnitude of this effect was equivalent across the language groups, as well as across emotion continua with and without lexical distinctions. Our results show that the perception of affective signals is not driven by lexical labels, instead lending support to accounts of emotions as a set of biologically evolved mechanisms.

5.
The current study investigated 6-, 9-, and 12-month-old infants’ ability to categorically perceive facial emotional expressions depicting faces from two continua: happy–sad and happy–angry. In a between-subject design, infants were tested on their ability to discriminate faces that were between-category (across the category boundary) or within-category (within an emotion category). Results suggest that 9- and 12-month-olds can discriminate between but not within categories for the happy–angry continuum. Infants could not discriminate between cross-boundary facial expressions in the happy–sad continuum at any age. We suggest a functional account: categorical perception may develop in conjunction with the emotion's relevance to the infant.

6.
Can face actions that carry significance within language be perceived categorically? We used continua produced by computational morphing of face-action images to explore this question in a controlled fashion. In Experiment 1 we showed that question-type, a syntactic distinction in British Sign Language (BSL), can be perceived categorically, but only when it is also identified as a question marker. A few hearing non-signers were sensitive to this distinction; among those who used sign, late sign learners were no less sensitive than early sign users. A very similar facial-display continuum between 'surprise' and 'puzzlement' was perceived categorically by deaf and hearing participants, irrespective of their sign experience (Experiment 2). The categorical processing of facial displays can be demonstrated for sign, but may be grounded in universally perceived distinctions between communicative face actions. Moreover, the categorical perception of facial actions is not confined to the six universal facial expressions.

7.
Participants in manipulated emotional states viewed computerised movies in which facial expressions of emotion changed into categorically different expressions. The participants' task was to detect the offset of the initial expression. An effect of emotional state was observed such that individuals in happy states saw the offset of happiness (changing into sadness) at an earlier point in the movies than did those in sad states. Similarly, sad-condition participants detected the offset of a sad expression changing into a happy expression earlier than did happy-condition participants. This result is consistent with a proposed role of facial mimicry in the perception of change in emotional expression. The results of a second experiment provide additional evidence for the mimicry account. The discussion focuses on the relationship between motor behaviour and perception.

8.
This study compared facial expression processing in college students with delayed social development and typical college students, exploring the characteristics and possible causes of expression processing in delayed individuals, and also tested the categorical perception effect for facial expressions. Using a morphed emotional-face paradigm, we found that, with the exception of fear, recognition of basic expressions improved as expression intensity increased. However, delayed individuals processed expressions more slowly than typical individuals and were worse at recognizing anger, and their category boundaries for sadness and anger in blended expressions were both shifted. Delayed individuals' expression-processing ability is inferior to that of typical individuals; they show a response bias toward sadness and a processing deficit for anger.

9.
Emotional facial expressions are perceived categorically. Little is known about individual differences in the position of the category boundary, or about whether category boundaries differ across stimulus continua. Similarly, little is known about whether individuals’ category boundaries are stable over time. We investigated these topics in a series of experiments designed to locate category boundaries using converging evidence from identification and discrimination tasks. We compared both across individuals and within individuals across two sessions that spanned a week. Results show differences between individuals in the location of category boundaries, and suggest that these differences are stable over time. We also found differences in boundary location when we compared images depicting different models.

10.
This study examined the effect of individual aggressiveness on response bias and sensitivity in the processing of angry expressions. Expression continua generated from angry and fearful prototypes served as materials, and a categorical perception paradigm was used to measure the category boundary points and slopes with which high- and low-aggression individuals identified and discriminated the anger-fear continuum. Results showed that, compared with low-aggression individuals, high-aggression individuals had a steeper identification-curve slope at the category boundary of the anger-fear continuum; high-aggression individuals also tended to shift the category boundary toward the fear end, although this shift did not reach statistical significance. These findings indicate that high-aggression individuals do not show a hostile attribution bias, but rather greater sensitivity to the transition between angry and fearful expressions.
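Several of the abstracts in this listing report a "category boundary" and "slope" estimated from identification data over a morph continuum. As a hedged illustration only (not the authors' actual method, which typically fits a logistic or cumulative Gaussian psychometric function), a minimal sketch with hypothetical data; `boundary_and_slope`, the morph levels, and the response proportions are all assumptions for the example:

```python
import numpy as np

def boundary_and_slope(levels, p_anger):
    """Estimate the category boundary (morph level where the proportion
    of 'anger' identifications crosses 50%) and the local slope there,
    by linear interpolation between the two levels straddling 0.5.

    A simplified stand-in for psychometric-function fitting.
    """
    levels = np.asarray(levels, dtype=float)
    p = np.asarray(p_anger, dtype=float)
    # first morph level at which 'anger' responses reach 50%
    i = int(np.argmax(p >= 0.5))
    x0, x1 = levels[i - 1], levels[i]
    y0, y1 = p[i - 1], p[i]
    slope = (y1 - y0) / (x1 - x0)          # steeper = sharper transition
    boundary = x0 + (0.5 - y0) / slope     # interpolated 50% crossing
    return boundary, slope

# Hypothetical identification data on a 0-100% fear-to-anger continuum.
levels = [0, 20, 40, 60, 80, 100]
p_anger = [0.02, 0.05, 0.20, 0.80, 0.95, 0.98]
b, s = boundary_and_slope(levels, p_anger)
print(round(b, 1))  # boundary near the midpoint of the continuum
print(round(s, 3))
```

On this toy data the boundary falls at morph level 50; a boundary shifted toward the fear end, as discussed above, would appear as a smaller boundary value.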

11.
Participants (N = 216) were administered a differential implicit learning task during which they were trained and tested on 3 maximally distinct 2nd-order visuomotor sequences, with sequence color serving as discriminative stimulus. During training, 1 sequence each was followed by an emotional face, a neutral face, and no face, using backward masking. Emotion (joy, surprise, anger), face gender, and exposure duration (12 ms, 209 ms) were varied between participants; implicit motives were assessed with a picture-story exercise. For power-motivated individuals, low-dominance facial expressions enhanced and high-dominance expressions impaired learning. For affiliation-motivated individuals, learning was impaired in the context of hostile faces. These findings did not depend on explicit learning of fixed sequences or on awareness of sequence-face contingencies.

12.
Social deficits are one of the most striking manifestations of autism spectrum disorders (ASDs). Among these social deficits, the recognition and understanding of emotional facial expressions has been widely reported to be affected in ASDs. We investigated emotional face processing in children with and without autism using event-related potentials (ERPs). High-functioning children with autism (n = 15, mean age = 10.5 ± 3.3 years) completed an implicit emotional task while visual ERPs were recorded. Two groups of typically developing children (chronological age-matched and verbal equivalent age-matched [both ns = 15, mean age = 7.7 ± 3.8 years]) also participated in this study. The early ERP responses to faces (P1 and N170) were delayed, and the P1 was smaller in children with autism than in typically developing children of the same chronological age, revealing that the first stages of emotional face processing are affected in autism. However, when matched by verbal equivalent age, only P1 amplitude remained affected in autism. Our results suggest that the emotional and facial processing difficulties in autism could start from atypicalities in visual perceptual processes involving rapid feedback to primary visual areas and subsequent holistic processing.

13.
Volitional attentional control has been found to rely on prefrontal neuronal circuits. According to the attentional control theory of anxiety, impairment in the volitional control of attention is a prominent feature in anxiety disorders. The present study investigated this assumption in socially anxious individuals using an emotional saccade task with facial expressions (happy, angry, fearful, sad, neutral). The gaze behavior of participants was recorded during the emotional saccade task, in which participants performed either pro- or antisaccades in response to peripherally presented facial expressions. The results show that socially anxious persons have difficulty inhibiting reflexive attention to facial expressions: they made more erroneous prosaccades to all facial expressions when an antisaccade was required. Thus, these findings indicate impaired attentional control in social anxiety. Overall, the present study shows a deficit of socially anxious individuals in attentional control, for example in inhibiting reflexive orienting to neutral as well as emotional facial expressions. This result may be due to a dysfunction in the prefrontal areas involved in attentional control.

14.
Previous research has shown that others' social status affects how individuals process their facial expressions, but the neural mechanisms underlying this influence remain unclear. In this study, participants first completed a time-estimation task; evaluators of high or low status then gave facial-expression feedback (happy, neutral, or angry) based on the participants' performance, while the ERP components elicited by the evaluators' facial expressions were recorded. The ERP results showed that, at the early P1 stage, angry and happy feedback expressions from low-status evaluators elicited larger P1 amplitudes than those from high-status evaluators; at the N2 stage, only happy feedback expressions from high-status evaluators elicited larger N2 amplitudes than those from low-status evaluators; and at the later P3 stage, only happy feedback expressions from low-status evaluators elicited larger P3 amplitudes than those from high-status evaluators. These results indicate that an evaluator's status affects not only the early but also the later stages of processing the evaluator's facial expressions.

15.
Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX discrimination and identification tasks on morphed affective and linguistic facial expression continua. The continua were created by morphing end-point photo exemplars into 11 images, changing linearly from one expression to another in equal steps. For both affective and linguistic expressions, hearing non-signers exhibited better discrimination across category boundaries than within categories for both experiments, thus replicating previous results with affective expressions and demonstrating CP effects for non-canonical facial expressions. Deaf signers, however, showed significant CP effects only for linguistic facial expressions. Subsequent analyses indicated that order of presentation influenced signers’ response time performance for affective facial expressions: viewing linguistic facial expressions first slowed response time for affective facial expressions. We conclude that CP effects for affective facial expressions can be influenced by language experience.
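The continuum construction described here (11 images changing linearly from one end-point expression to another in equal steps) can be sketched as a simple pixel cross-fade. Real facial morphs also warp landmark geometry before blending, so this is an illustrative simplification only; `morph_continuum` and the toy arrays are hypothetical, not the software used in these studies:

```python
import numpy as np

def morph_continuum(face_a, face_b, n_steps=11):
    """Blend two images in equal linear steps from face_a to face_b.

    Step 0 is face_a, the last step is face_b, and intermediate
    images mix the two with evenly spaced weights.
    """
    weights = np.linspace(0.0, 1.0, n_steps)
    return [(1.0 - w) * face_a + w * face_b for w in weights]

# Toy 2x2 "images": the end points are all-zeros and all-ones.
a = np.zeros((2, 2))
b = np.ones((2, 2))
continuum = morph_continuum(a, b)

print(len(continuum))  # 11 images, as in the studies above
```

Discrimination pairs for an ABX task would then be drawn from adjacent or near-adjacent steps, either within a category or straddling the boundary.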

16.
We tested whether socially anxious individuals perform better in processing facial information with low spatial frequencies (LSFs). For this, we presented socially anxious and nonanxious participants with hybrid face stimuli that contained independent facial expressions in high (HSF) and LSF bands. In two tasks, participants either rated the images according to "angriness" or had to learn how hybrid facial expressions predicted the location of an upcoming target. We found mostly additive effects of LSF and HSF information in the rating task for both groups. In contrast, socially anxious participants showed better prediction performance for LSF expressions in the implicit learning task. We conclude that socially anxious participants are more sensitive to facial information within LSFs, but this higher sensitivity may become mostly evident in indirect tasks.

17.
Relatively few studies have examined memory bias for social stimuli in depression or dysphoria. The aim of this study was to investigate the influence of depressive symptoms on memory for facial information. A total of 234 participants completed the Beck Depression Inventory II and a task examining memory for facial identity and expression of happy and sad faces. For both facial identity and expression, the recollective experience was measured with the Remember/Know/Guess procedure (Gardiner & Richardson-Klavehn, 2000). The results show no major association between depressive symptoms and memory for identities. However, dysphoric individuals consciously recalled (Remember responses) more sad facial expressions than non-dysphoric individuals. These findings suggest that sad facial expressions led to more elaborate encoding, and thereby better recollection, in dysphoric individuals.

19.
Doi H, Kato A, Hashimoto A, Masataka N. Perception, 2008, 37(9): 1399-1411
Data on the development of the perception of facial biological motion during the preschool years are scarce. We investigated the ability of preschoolers to recognise happy, angry, and surprised expressions, and eye-closing facial movements, on the basis of facial biological motion. Children aged 4 years (n = 18) and 5-6 years (n = 19), and adults (n = 17) participated in a matching task, in which they were required to match point-light displays of facial expressions to prototypic schematic images of facial expressions and facial movement. The results revealed that the ability to recognise facial expressions from biological motion emerges as early as the age of 4 years. This ability was evident for happy expressions at the age of 4 years; 5-6-year-olds reliably recognised surprised as well as happy expressions. The theoretical significance of these findings is discussed.

20.
Spontaneous mimicry, including that of emotional facial expressions, is important for socio-emotional skills such as empathy and communication. Those skills are often impacted in autism spectrum disorders (ASD). Successful mimicry requires not only the activation of the response, but also its appropriate speed. Yet, previous studies examined ASD differences in only response magnitude. The current study investigated timing and magnitude of spontaneous and voluntary mimicry in ASD children and matched controls using facial electromyography (EMG). First, participants viewed and recognized happy, sad, fear, anger, disgust and neutral expressions presented at different durations. Later, participants voluntarily mimicked the expressions. There were no group differences on emotion recognition and amplitude of expression-appropriate EMG activity. However, ASD participants' spontaneous, but not voluntary, mimicry activity was delayed by about 160 ms. This delay occurred across different expressions and presentation durations. We relate these findings to the literature on mirroring and temporal dynamics of social interaction.

