Similar Documents
20 similar documents found.
1.
The purpose of the present study was to determine whether 7- to 10-year-old socially anxious children (n = 26) made systematic errors in identifying and sending emotions in facial expressions, paralanguage, and postures, as compared with the more random errors of inattentive-hyperactive children (n = 21). Socially anxious children made more errors than their inattentive-hyperactive peers in identifying anger and fear in children's facial expressions, in identifying anger in adults' postures, and in expressing anger in their own facial expressions. Results suggest that systematic difficulties specifically in visual nonverbal emotion communication may contribute to the personal and social difficulties socially anxious children experience.

2.
The current study investigated 6-, 9-, and 12-month-old infants' ability to categorically perceive facial emotional expressions drawn from two continua: happy–sad and happy–angry. In a between-subject design, infants were tested on their ability to discriminate faces that were between-category (across the category boundary) or within-category (within an emotion category). Results suggest that 9- and 12-month-olds can discriminate between but not within categories for the happy–angry continuum. Infants could not discriminate between cross-boundary facial expressions in the happy–sad continuum at any age. We suggest a functional account: categorical perception may develop in conjunction with an emotion's relevance to the infant.

3.
Hosie, J. A., Gray, C. D., Russell, P. A., Scott, C., & Hunter, N. (1998). Motivation and Emotion, 22(4), 293–313.
This paper reports the results of three tasks comparing the development of the understanding of facial expressions of emotion in deaf and hearing children. Two groups of hearing and deaf children of elementary school age were tested for their ability to match photographs of facial expressions of emotion, and to produce and comprehend emotion labels for the expressions of happiness, sadness, anger, fear, disgust, and surprise. Accuracy data showed comparable levels of performance for deaf and hearing children of the same age. Happiness and sadness were the most accurately matched expressions and the most accurately produced and comprehended labels. Anger was the least accurately matched expression and the most poorly comprehended emotion label. Disgust was the least accurately labeled expression; however, deaf children were more accurate at labeling this expression, and also at labeling fear, than hearing children. Error data revealed that children confused anger with disgust, and fear with surprise. However, the younger groups of deaf and hearing children also showed a tendency to confuse the negative expressions of anger, disgust, and fear with sadness. The results suggest that, despite possible differences in the early socialisation of emotion, deaf and hearing children share a common understanding of the emotions conveyed by distinctive facial expressions.

4.
Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.

5.
Studies on adults have revealed a disadvantageous effect of negative emotional stimuli on executive functions (EF), and it has been suggested that this effect is amplified in children. The present study's aim was to assess how emotional facial expressions affect working memory in 9- to 12-year-olds, using a working memory task with emotional facial expressions as stimuli. Additionally, we explored how the degree of internalizing and externalizing symptoms in typically developing children was related to performance on the same task. Before the working memory task was employed, an independent sample of 9- to 12-year-olds was asked to recognize the facial expressions intended to serve as stimuli and to rate them for intensity of the expressed emotion and for arousal, providing a baseline for how children in this age range recognize and react to facial expressions. This first study revealed that children rated the facial expressions with similar intensity and arousal across age. When the working memory task with facial expressions was employed, results revealed that negatively valenced expressions impaired working memory more than neutral and positively valenced expressions. The ability to successfully complete the working memory task increased between 9 and 12 years of age. Children's total problems were associated with poorer performance on the working memory task with facial expressions. Results on the effect of emotion on working memory are discussed in light of recent models and empirical findings on how emotional information might interact and interfere with cognitive processes such as working memory.

6.
The authors evaluated an intervention program developed to remediate children's deficits in reading emotions in facial expressions. Thirty children from two elementary schools in suburban Atlanta participated in six 30-minute sessions over 4 weeks in which they were taught to discriminate, identify, express, and apply facial expression cues. The ability to read emotion in facial expressions improved significantly for the intervention group compared with the control group. For girls, improvement in identifying facial expressions was associated with lower social anxiety and higher feelings of self-worth. Boys' self-concept was negatively related to improvement. On the basis of the results, the authors suggested that structured interventions like the present one could be used to improve students' nonverbal processing abilities within public school settings, but with some cautions regarding the impact of new learning for boys.

8.
The authors aimed to examine the possible association between (a) accurately reading emotion in facial expressions and (b) social and academic competence among elementary school-aged children. Participants were 840 7-year-old children who completed a test of the ability to read emotion in facial expressions. Teachers rated children's social and academic behavior using behavioral rating scales. Children who had more difficulty identifying emotion in faces were also more likely to have problems overall, and more specifically with peer relationships among boys and with learning difficulties among girls. Findings suggest that nonverbal receptive skill plays a significant role in children's social and academic adjustment.

9.
This study investigates the discrimination accuracy of emotional stimuli in subjects with major depression compared with healthy controls using photographs of facial expressions of varying emotional intensities. The sample included 88 unmedicated male and female subjects, aged 18-56 years, with major depressive disorder (n = 44) or no psychiatric illness (n = 44), who judged the emotion of 200 facial pictures displaying an expression between 10% (90% neutral) and 80% (nuanced) emotion. Stimuli were presented in 10% increments to generate a range of intensities, each presented for a 500-ms duration. Compared with healthy volunteers, depressed subjects showed very good recognition accuracy for sad faces but impaired recognition accuracy for other emotions (e.g., harsh, surprise, and sad expressions) of subtle emotional intensity. Recognition accuracy improved for both groups as a function of increased intensity on all emotions. Finally, as depressive symptoms increased, recognition accuracy increased for sad faces, but decreased for surprised faces. Moreover, depressed subjects showed an impaired ability to accurately identify subtle facial expressions, indicating that depressive symptoms influence accuracy of emotional recognition.

10.
We tested the capacity to perceive visual expressions of emotion, and to use those expressions as guides to social decisions, in three groups of 8- to 10-year-old Romanian children: children abandoned to institutions then randomly assigned to remain in 'care as usual' (institutional care); children abandoned to institutions then randomly assigned to a foster care intervention; and community children who had never been institutionalized. Experiment 1 examined children's recognition of happy, sad, fearful, and angry facial expressions that varied in intensity. Children assigned to institutional care had higher thresholds for identifying happy expressions than foster care or community children, but did not differ in their thresholds for identifying the other facial expressions. Moreover, the error rates of the three groups of children were the same for all of the facial expressions. Experiment 2 examined children's ability to use facial expressions of emotion to guide social decisions about whom to befriend and whom to help. Children assigned to institutional care were less accurate than foster care or community children at deciding whom to befriend; however, the groups did not differ in their ability to decide whom to help. Overall, although there were group differences in some abilities, all three groups of children performed well across tasks. The results are discussed in the context of theoretical accounts of the development of emotion processing.

11.
Doi, H., Kato, A., Hashimoto, A., & Masataka, N. (2008). Perception, 37(9), 1399–1411.
Data on the development of the perception of facial biological motion during the preschool years are disproportionately scarce. We investigated the ability of preschoolers to recognise happy, angry, and surprised expressions, and eye-closing facial movements, on the basis of facial biological motion. Children aged 4 years (n = 18) and 5-6 years (n = 19), and adults (n = 17) participated in a matching task, in which they were required to match point-light displays of facial expressions to prototypic schematic images of facial expressions and facial movement. The results revealed that the ability to recognise facial expressions from biological motion emerges as early as the age of 4 years. This ability was evident for happy expressions at the age of 4 years; 5-6-year-olds reliably recognised surprised as well as happy expressions. The theoretical significance of these findings is discussed.

12.
This study examined whether African American children's ability to identify emotion in the facial expressions and tones of voice of European American stimuli was comparable to that of their European American peers and related to personality, social competence, and achievement. The Diagnostic Analysis of Nonverbal Accuracy (DANVA; Nowicki & Duke, 1994) was administered to 84 African American children. It was found that they performed less accurately on adult and child tones of voice and adult facial expressions. Further, girls' ability to read emotion in tones of voice was related to better social competence and achievement, whereas boys' ability to identify emotion in adult tones of voice was related to teacher-rated social competence. Results suggest that more research is needed with ethnic groups to clarify the impact of nonverbal processing skills on social and achievement outcomes.

13.
Individual differences in young children's frustration responses set the stage for myriad developmental outcomes and represent an area of intense empirical interest. Emotion regulation is hypothesized to comprise the interplay of complex behaviors, such as facial expressions, and activation of concurrent underlying neural systems. At present, however, the literature has mostly examined children's observed emotion regulation behaviors and assumed underlying brain activation through separate investigations, resulting in theoretical gaps in our understanding of how children regulate emotion in vivo. Our goal was to elucidate links between young children's emotion regulation-related neural activation, facial muscular movements, and parent-rated temperamental emotion regulation. Sixty-five children (age 3–7) completed a frustration-inducing computer task while lateral prefrontal cortex (LPFC) activation and concurrent facial expressions were recorded. Negative facial expressions with eye constriction were inversely associated with both parent-rated temperamental emotion regulation and concurrent LPFC activation. Moreover, we found evidence that positive expressions with eye constriction during frustration may be associated with stronger LPFC activation. Results suggest a correspondence between facial expressions and LPFC activation that may explicate how children regulate emotion in real time.

14.
Children are often surrounded by other humans and companion animals (e.g., dogs, cats), and understanding facial expressions in all these social partners may be critical to successful social interactions. In an eye-tracking study, we examined how children (4–10 years old) view and label facial expressions in adult humans and dogs. We found that children looked more at dogs than humans, and more at negative than positive or neutral human expressions. Their viewing patterns (Proportion of Viewing Time, PVT) at individual facial regions were also modified by the viewed species and emotion, with the eyes not always being most viewed: this related to positive anticipation when viewing humans, whilst when viewing dogs, the mouth was viewed more than or equally to the eyes for all emotions. We further found that children's labelling (Emotion Categorisation Accuracy, ECA) was better for the perceived valence than for the emotion category, with positive human expressions easier than both positive and negative dog expressions. They performed poorly when asked to freely label facial expressions, but performed better for human than dog expressions. Finally, we found some effects of age, sex, and other factors (e.g., experience with dogs) on both PVT and ECA. Our study shows that children have a different gaze pattern and identification accuracy compared to adults when viewing the faces of human adults and dogs. We suggest that for recognising human (own-face-type) expressions, familiarity obtained through casual social interactions may be sufficient; but for recognising dog (other-face-type) expressions, explicit training may be required to develop competence.

Highlights

  • We conducted an eye-tracking experiment to investigate how children view and categorise facial expressions in adult humans and dogs
  • Children's viewing patterns were significantly dependent upon the facial region, species, and emotion viewed
  • Children's categorisation also varied with the species and emotion viewed, with better performance for valence than emotion categories
  • Own-face-types (adult humans) are easier than other-face-types (dogs) for children, and casual familiarity with the latter (e.g., through family dogs) is not enough to achieve perceptual competence

15.
The development of children's ability to identify facial emotional expressions has long been suggested to be experience dependent, with parental caregiving as an important influencing factor. This study attempts to further this knowledge by examining disorganization of the attachment system as a potential psychological mechanism behind aberrant caregiving experiences and deviations in the ability to identify facial emotional expressions. Typically developing children (N = 105, 49.5% boys) aged 6–7 years (M = 6 years 8 months, SD = 1.8 months) completed an attachment representation task and an emotion identification task, and parents rated children's negative emotionality. The results showed a generally diminished ability in disorganized children to identify facial emotional expressions, but no response biases. Disorganized attachment was also related to higher levels of negative emotionality, but discrimination of emotional expressions did not moderate or mediate this relation. Our novel findings relate disorganized attachment to deviations in emotion identification, and therefore suggest that disorganization of the attachment system may constitute a psychological mechanism linking aberrant caregiving experiences to deviations in children's ability to identify facial emotional expressions. Our findings further suggest that deviations in emotion identification in disorganized children, in the absence of maltreatment, may manifest in a generally diminished ability to identify emotional expressions, rather than in specific response biases.

17.
The important ability to discriminate facial expressions of emotion develops early in human ontogeny. In the present study, 7-month-old infants' event-related potentials (ERPs) in response to angry and fearful emotional expressions were measured. The angry face evoked a larger negative component (Nc) at fronto-central leads between 300 and 600 ms after stimulus onset than the fearful face. Furthermore, over posterior channels, the angry expression elicited an N290 that was larger in amplitude and a P400 that was smaller in amplitude than those for the fearful expression. This is the first study to show that infants' ability to discriminate angry and fearful facial expressions can be measured at the electrophysiological level. These data suggest that 7-month-olds allocated more attentional resources to the angry face, as indexed by the Nc. One implication is that the two expressions' social signal values were perceived differentially, not merely as "negative". Furthermore, it is possible that the angry expression was more arousing and discomforting for the infant than the fearful expression.

18.
This study investigated how target sex, target age, and expressive ambiguity influence emotion perception. Undergraduate participants (N = 192) watched morphed video clips of eight child and eight adult facial expressions shifting from neutral to either sadness or anger. Participants were asked to stop the video clip when they first saw an emotion appear (perceptual sensitivity) and were asked to identify the emotion that they saw (accuracy). Results indicate that female participants identified sad expressions sooner in female targets than in male targets. Participants were also more accurate identifying angry facial expressions by male children than by female children. Findings are discussed in terms of the effects of ambiguity, gender, and age on the perception of emotional expressions.

19.
The present study investigated emotion recognition accuracy and its relation to social adjustment in 7- to 10-year-old children. The ability to recognize basic emotions from facial and vocal expressions was measured and compared to peer popularity and to teacher-rated social competence. The results showed that emotion recognition was related to these measures of social adjustment, but the child's gender and the emotion category affected this relationship. Emotion recognition accuracy was significantly related to social adjustment for girls, but not for boys. For girls, recognition of surprise in particular was related to social adjustment. Together, these results suggest that the ability to recognize others' emotional states from nonverbal cues is an important socio-cognitive ability for school-aged girls.

20.
There is substantial evidence to suggest that deafness is associated with delays in emotion understanding, which has been attributed to delays in language acquisition and fewer opportunities to converse. However, studies addressing the ability to recognise facial expressions of emotion have produced equivocal findings. The two experiments presented here attempt to clarify emotion recognition in deaf children by considering two aspects: the role of motion and the role of intensity in deaf children's emotion recognition. In Study 1, 26 deaf children were compared to 26 age-matched hearing controls on a computerised facial emotion recognition task involving static and dynamic expressions of six emotions. Eighteen of the deaf children and 18 age-matched hearing controls additionally took part in Study 2, involving the presentation of the same six emotions at varying intensities. Study 1 showed that deaf children's emotion recognition was better in the dynamic than the static condition, whereas the hearing children showed no difference in performance between the two conditions. In Study 2, the deaf children performed no differently from the hearing controls, showing improved recognition rates with increasing intensity. With the exception of disgust, no differences for individual emotions were found. These findings highlight the importance of using ecologically valid stimuli to assess emotion recognition.
