Similar Articles
Found 20 similar articles (search time: 15 ms)
1.
This study investigated children’s perceptual ability to process second-order facial relations. In total, 78 children in three age groups (7, 9, and 11 years) and 28 adults were asked to say whether the eyes were the same distance apart in two side-by-side faces. The two faces were similar on all points except the space between the eyes, which was either the same or different, with various degrees of difference. The results showed that the smallest eye spacing children were able to discriminate decreased with age. This ability was sensitive to face orientation (upright or upside-down), and this inversion effect increased with age. It is concluded here that, despite early sensitivity to configural/holistic information, the perceptual ability to process second-order relations in faces improves with age and constrains the development of the face recognition ability.

2.
This investigation examined whether impairment in configural processing could explain deficits in face emotion recognition in people with Parkinson’s disease (PD). Stimuli from the Radboud Faces Database were used to compare recognition of four negative emotion expressions by older adults with PD (n = 16) and matched controls (n = 17). Participants were tasked with categorizing emotional expressions from upright and inverted whole faces and facial composites; it is difficult to derive configural information from these two types of stimuli, so featural processing should play a larger than usual role in accurate recognition of emotional expressions. We found that the PD group were impaired relative to controls in recognizing anger, disgust, and fearful expressions in upright faces. Furthermore, consistent with a configural processing deficit, participants with PD showed no composite effect when attempting to identify facial expressions of anger, disgust, and fear. A face inversion effect, however, was observed in the performance of all participants in both the whole faces and facial composites tasks. These findings can be explained in terms of a configural processing deficit if it is assumed that the disruption caused by facial composites was specific to configural processing, whereas inversion reduced performance by making it difficult to derive both featural and configural information from faces.

3.
The aim of the present study was to contribute to the literature on the ability to recognize anger, happiness, fear, surprise, sadness, disgust, and neutral expressions from facial information (whole face, eye region, mouth region). More specifically, the aim was to investigate older adults' performance in emotion recognition using the same tool used in previous studies on children and adults, and to verify whether their pattern of emotion recognition differs from that of the other two groups. Results showed that happiness was among the easiest emotions to recognize, while disgust was consistently among the most difficult for older adults. The findings indicate that emotions are recognized more easily when pictures represent the whole face; of the specific regions (eyes and mouth), older participants recognized emotions more easily when the mouth region was presented. In general, the study did not detect a decline in the ability to recognize emotions from the face, eyes, or mouth. The performance of the older adults was statistically worse than that of the other two groups in only a few cases: in anger and disgust recognition from the whole face; in anger recognition from the eye region; and in disgust, fear, and neutral expression recognition from the mouth region.

4.
Several studies have investigated the role of featural and configural information in processing facial identity. Much less is known about their contribution to emotion recognition. In this study, we addressed this issue by inducing either a featural or a configural processing strategy (Experiment 1) and by investigating the attentional strategies in response to emotional expressions (Experiment 2). In Experiment 1, participants identified emotional expressions in faces that were presented in three different versions (intact, blurred, and scrambled) and in two orientations (upright and inverted). Blurred faces contain mainly configural information, and scrambled faces contain mainly featural information. Inversion is known to selectively hinder configural processing. Analyses of the discriminability measure (A′) and response times (RTs) revealed that configural processing plays a more prominent role in expression recognition than featural processing, but their relative contribution varies depending on the emotion. In Experiment 2, we qualified these differences between emotions by investigating the relative importance of specific features by means of eye movements. Participants had to match intact expressions with the emotional cues that preceded the stimulus. The analysis of eye movements confirmed that the recognition of different emotions relies on different types of information. While the mouth is important for the detection of happiness and fear, the eyes are more relevant for anger, fear, and sadness.
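The discriminability measure A′ mentioned above is a standard nonparametric index of sensitivity computed from hit and false-alarm rates. As a minimal illustrative sketch (not the authors' analysis code), assuming the commonly used Grier (1971) formula:

```python
def a_prime(hit_rate: float, fa_rate: float) -> float:
    """Nonparametric discriminability A' (Grier, 1971).

    0.5 indicates chance-level discrimination; 1.0 is perfect.
    """
    h, f = hit_rate, fa_rate
    if h == f:
        return 0.5
    if h > f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    # Below-chance performance: roles of hits and false alarms swap.
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))

# Example: 80% hits, 20% false alarms.
print(round(a_prime(0.8, 0.2), 3))  # → 0.875
```

Unlike d′, A′ requires no normality assumption, which makes it convenient for expression-recognition accuracy data with few trials per condition.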

5.
Abstract

Adult face recognition is severely hampered by stimulus inversion. Several investigators have attributed this vulnerability to the effect of orientation on encoding relational aspects of faces. Previous work has also demonstrated that children are less sensitive to orientation of faces than are adults. This has been interpreted as reflecting an increasing reliance on configural aspects of faces with increasing age and expertise.

Young, Hellawell, and Hay (1987) demonstrated that for adults the encoding of relations among facial parts is, indeed, sensitive to orientation. When chimeric faces are upright, the top half of one face fuses with the bottom half of the other, making the person depicted in the top half difficult to recognize. This effect (the composite effect) is not seen when the faces are inverted. The present study obtained the composite effect for 6-year-old and 10-year-old children, just as for adults. The composite effect was found to an equal degree at all ages tested and was seen both in tasks involving highly familiar faces and in those involving newly learned, previously unfamiliar faces. Thus, these data provided no support for the hypothesis of increasing reliance on configural aspects of faces with increasing age, at least in the sense tapped by this procedure.

However, the data did confirm an Age X Orientation interaction. In recognizing both familiar and previously unfamiliar faces, 6-year-olds were less affected by inversion than were 10-year-olds, who, in turn, were less affected than were adults. Increasing vulnerability to inversion of faces with age was independent of the composite effect. Apparently, there are two distinct sources to the large effect of inversion that characterizes adult face encoding: one seen throughout development and one acquired only with expertise.

6.
When faces are turned upside down, recognition is known to be severely disrupted. This effect is thought to be due to disruption of configural processing. Recently, Leder and Bruce (2000, Quarterly Journal of Experimental Psychology A 53 513-536) argued that configural information in face processing consists at least partly of locally processed relations between facial elements. In three experiments we investigated whether a local relational feature (the interocular distance) is processed differently in upside-down versus upright faces. In experiment 1 participants decided in which of two sequentially presented photographic faces the interocular distance was larger. The decision was more difficult in upside-down presentation. Three different conditions were used in experiment 2 to investigate whether this deficit depends upon parts of the face beyond the eyes themselves; displays showed the eye region alone, the eyes and nose, or the eyes and nose and mouth. The availability of additional features did not interact with the inversion effect which was observed strongly even when the eyes were shown in isolation. In experiment 3 all eyes were turned upside down in the inverted face condition as in the Thatcher illusion (Thompson, 1980 Perception 9 483-484). In this case no inversion effect was found. These results are in accordance with an explanation of the face-inversion effect in which the disruption of configural facial information plays the critical role in memory for faces, and in which configural information corresponds to spatial information that is processed in a way which is sensitive to local properties of the facial features involved.

7.
Picture-plane inversion leads to qualitative changes of face perception
Rossion, B. (2008). Acta Psychologica, 128(2), 274-289.
Presenting a face stimulus upside-down generally causes a larger deficit in perceiving metric distances between facial features ("configuration") than in perceiving local properties of these features. This effect supports a qualitative account of face inversion: the same transformation affects the processing of different kinds of information differently. However, this view has recently been challenged by studies reporting equal inversion costs for discriminating featural and configural manipulations on faces. In this paper I argue that these studies failed to replicate previous results due to methodological factors rather than largely irrelevant parameters such as having equal performance for configural and featural conditions at upright orientation, or randomizing trials across conditions. I also argue that identifying similar diagnostic features (eyes and eyebrows) for discriminating individual faces at upright and inverted orientations by means of response classification methods does not at all dismiss the qualitative view of face inversion. Considering these elements as well as both behavioral and neuropsychological evidence, I propose that the generally larger effect of inversion on processing configural than featural cues is a mere consequence of the disruption of holistic face perception. That is, configural relations necessarily involve two or more distant features on the face, such that their perception is most dependent on the ability to perceive simultaneously multiple features of a face as a whole.

8.
Past research has shown that children recognize emotions from facial expressions poorly and improve only gradually with age, but the stimuli in such studies have been static faces. Because dynamic faces include more information, it may well be that children more readily recognize emotions from dynamic facial expressions. The current study of children (N = 64, aged 5–10 years) who freely labeled the emotion conveyed by static and dynamic facial expressions found no advantage of dynamic over static expressions; in fact, reliable differences favored static expressions. An alternative explanation of gradual improvement with age is that children's emotional categories change during development from a small number of broad emotion categories to a larger number of narrower categories: a pattern found here with both static and dynamic expressions.

9.
The face inversion effect (FIE) is a reduction in recognition performance for inverted faces (compared to upright faces) that is greater than that typically observed with other stimulus types (e.g., houses). The work of Diamond and Carey suggests that a special type of configural information, “second-order relational information”, is critical in generating this inversion effect. However, Tanaka and Farah concluded that greater reliance on second-order relational information did not directly result in greater sensitivity to inversion, and they suggested that the FIE is not entirely due to a reliance on this type of configural information. A more recent review by McKone and Yovel provides a meta-analysis that makes a similar point. In this paper, we investigated the contributions made by configural and featural information to the FIE. Experiments 1a and 1b investigated the link between configural information and the FIE. Remarkably, Experiment 1b showed that disruption of all configural information of the type considered in Diamond and Carey's analysis (both first and second order) was effective in reducing recognition performance but did not significantly impact the FIE. Experiments 2 and 3 revealed that face processing is affected by the orientation of individual features and that this plays a major role in producing the FIE. The FIE was only completely eliminated when we disrupted the single-feature orientation information in addition to the configural information, by using a new type of transformation similar to Thatcherizing our sets of scrambled faces. We conclude by noting that our results for scrambled faces are consistent with an account that has recognition performance entirely determined by the proportion of upright facial features within a stimulus, and that any ability to make use of the spatial configuration of these features seems to benefit upright and inverted normal faces alike.

10.
Research on facial expression recognition has long centered on the structural features of the face itself, but recent studies have found that the recognition of facial expressions is also influenced by the context in which the face appears (e.g., language, body posture, natural and social scenes); context exerts an especially strong influence when the expressions to be identified resemble one another. This paper first reviews and analyzes recent research on how contexts such as language, body movements, and natural and social scenes influence individuals' recognition of facial expressions. It then analyzes how factors such as cultural background, age, and anxiety level modulate these contextual effects on facial expression recognition. Finally, it emphasizes that future research should attend to child participants, expand the categories of emotion studied, and focus on face emotion perception in real-life settings.

11.
Studies on adults have revealed a disadvantageous effect of negative emotional stimuli on executive functions (EF), and it has been suggested that this effect is amplified in children. The present study's aim was to assess how emotional facial expressions affected working memory in 9- to 12-year-olds, using a working memory task with emotional facial expressions as stimuli. Additionally, we explored how the degree of internalizing and externalizing symptoms in typically developing children was related to performance on the same task. Before employing the working memory task, an independent sample of 9- to 12-year-olds was asked to recognize the facial expressions intended to serve as stimuli and to rate them for intensity of the expressed emotion and for arousal, to obtain a baseline for how children in this age range recognize and react to facial expressions. This first study revealed that children rated the facial expressions with similar intensity and arousal across ages. When the working memory task with facial expressions was employed, results revealed that negatively valenced expressions impaired working memory more than neutral and positively valenced expressions. The ability to successfully complete the working memory task increased from 9 to 12 years of age. Children's total problems were associated with poorer performance on the working memory task with facial expressions. Results on the effect of emotion on working memory are discussed in light of recent models and empirical findings on how emotional information might interact and interfere with cognitive processes such as working memory.

12.
Research suggests that infants progress from discrimination to recognition of emotions in faces during the first half year of life. It is unknown whether the perception of emotions from bodies develops in a similar manner. In the current study, when presented with happy and angry body videos and voices, 5-month-olds looked longer at the matching video when they were presented upright but not when they were inverted. In contrast, 3.5-month-olds failed to match even with upright videos. Thus, 5-month-olds but not 3.5-month-olds exhibited evidence of recognition of emotions from bodies by demonstrating intermodal matching. In a subsequent experiment, younger infants did discriminate between body emotion videos but failed to exhibit an inversion effect, suggesting that discrimination may be based on low-level stimulus features. These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months. This pattern of development is similar to face emotion knowledge development and suggests that both the face and body emotion perception systems develop rapidly during the first half year of life.

13.
The present study investigated emotion recognition accuracy and its relation to social adjustment in 7- to 10-year-old children. The ability to recognize basic emotions from facial and vocal expressions was measured and compared to peer popularity and to teacher-rated social competence. The results showed that emotion recognition was related to these measures of social adjustment, but the gender of a child and the emotion category affected this relationship. Emotion recognition accuracy was significantly related to social adjustment for the girls, but not for the boys. For the girls, the recognition of surprise in particular was related to social adjustment. Together, these results suggest that the ability to recognize others' emotional states from nonverbal cues is an important socio-cognitive ability for school-aged girls.

14.
The ability to recognize emotions from others’ nonverbal behavior (emotion recognition ability, ERA) is crucial to successful social functioning. However, no self-administered ERA training covering multiple sensory channels currently exists for non-clinical adults. We conducted four studies in a lifespan sample of participants in the laboratory and online (total N = 531) to examine the effectiveness of a short computer-based training for 14 different emotions using audiovisual clips of emotional expressions. Results showed that overall, young and middle-aged participants who had received the training scored significantly higher on facial, vocal, and audiovisual emotion recognition than the control groups. The training effect for audiovisual ERA persisted over 4 weeks. In older adults (59–90 years), however, the training had no effect. The new, brief training could be useful in applied settings such as professional training, at least for younger and middle-aged adults. In older adults, improving ERA might require a longer and more interactive intervention.

15.
The authors investigated children's ability to recognize emotions from the information available in the lower, middle, or upper face. School-age children were shown partial or complete facial expressions and asked to say whether they corresponded to a given emotion (anger, fear, surprise, or disgust). The results indicate that 5-year-olds were able to recognize fear, anger, and surprise from partial facial expressions. Fear was better recognized from the information located in the upper face than from that located in the lower face. A similar pattern of results was found for anger, but only in girls. Recognition improved between 5 and 10 years of age for surprise and anger, but not for fear and disgust.

16.
Our facial expressions give others access to our feelings and constitute an important nonverbal tool for communication. Many recent studies have investigated emotional perception in adults, and our knowledge of the neural processes involved in emotions is increasingly precise. Young children also use faces to express their internal states and perceive emotions in others, but little is known about the neurodevelopment of expression recognition. The goal of the current study was to characterize the normal development of facial emotion perception. We recorded ERPs in 82 children 4 to 15 years of age during an implicit processing task with emotional faces. Task and stimuli were the same as those used and validated in an adult study; we focused on the components that showed sensitivity to emotions in adults (P1, N170, and a frontal slow wave). An effect of the emotion expressed by faces was seen on the P1 in the youngest children. With increasing age this effect disappeared, while an emotional sensitivity emerged on the N170. Early emotional processing in young children differed from that observed in the adolescents, whose pattern approached that of adults. In contrast, the later frontal slow wave, although showing typical age effects, was more positive for neutral and happy faces across age groups. Thus, despite the precocious use of facial emotions, the neural processing involved in the perception of emotional faces develops in a staggered fashion throughout childhood, with the adult pattern appearing only late in adolescence.

17.
The present study investigated the effect of the perception of faces expressing shame on time perception in children aged 5 and 8 years, as well as in adults, as a function of their ability to recognize this emotional expression. The participants' ability to recognize the expression of shame among faces expressing different emotions was tested. They were then asked to perform a temporal bisection task involving both neutral and ashamed faces. The results showed that, from the age of 8 years, the participants who recognized the facial expressions of shame underestimated their presentation time compared to that of neutral faces. In contrast, no time distortion was observed in the children who did not recognize the ashamed faces or in those younger children who did recognize them. The results are discussed in terms of self-conscious emotions which develop to involve an attentional mechanism.
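In a temporal bisection task like the one above, each participant yields a psychometric function: the proportion of "long" responses at each probe duration. Underestimating a stimulus shifts the bisection point (the duration judged "long" half the time) toward longer durations. A minimal sketch of estimating the bisection point by linear interpolation (an illustrative method and hypothetical data, not the authors' analysis):

```python
def bisection_point(durations, p_long, threshold=0.5):
    """Estimate the duration at which p('long') crosses `threshold`,
    interpolating linearly between adjacent probe durations."""
    points = list(zip(durations, p_long))
    for (d0, p0), (d1, p1) in zip(points, points[1:]):
        if p0 <= threshold <= p1:
            if p1 == p0:  # flat segment sitting exactly at threshold
                return (d0 + d1) / 2
            return d0 + (threshold - p0) / (p1 - p0) * (d1 - d0)
    raise ValueError("threshold not crossed within the probe range")

# Hypothetical probe durations (ms) and proportions of 'long' responses.
durations = [400, 600, 800, 1000, 1200, 1400, 1600]
p_long = [0.05, 0.10, 0.30, 0.55, 0.80, 0.95, 1.00]
print(round(bisection_point(durations, p_long)))  # → 960
```

A bisection point for emotional faces higher than that for neutral faces would indicate that the emotional stimuli were experienced as shorter, i.e., underestimated.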

18.
The ability to recognize and label emotional facial expressions is an important aspect of social cognition. However, existing paradigms to examine this ability present only static facial expressions, suffer from ceiling effects or have limited or no norms. A computerized test, the Emotion Recognition Task (ERT), was developed to overcome these difficulties. In this study, we examined the effects of age, sex, and intellectual ability on emotion perception using the ERT. In this test, emotional facial expressions are presented as morphs gradually expressing one of the six basic emotions from neutral to four levels of intensity (40%, 60%, 80%, and 100%). The task was administered in 373 healthy participants aged 8–75. In children aged 8–17, only small developmental effects were found for the emotions anger and happiness, in contrast to adults who showed age‐related decline on anger, fear, happiness, and sadness. Sex differences were present predominantly in the adult participants. IQ only minimally affected the perception of disgust in the children, while years of education were correlated with all emotions but surprise and disgust in the adult participants. A regression‐based approach was adopted to present age‐ and education‐ or IQ‐adjusted normative data for use in clinical practice. Previous studies using the ERT have demonstrated selective impairments on specific emotions in a variety of psychiatric, neurologic, or neurodegenerative patient groups, making the ERT a valuable addition to existing paradigms for the assessment of emotion perception.
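Morphed stimuli of the kind described above can be understood as linear blends between a neutral image and a full-intensity expression. A minimal sketch of generating such intensity levels, treating images as flat lists of pixel values (an illustration of the general morphing technique, not the ERT's actual stimulus-generation code):

```python
def morph(neutral, emotional, intensity):
    """Blend two equally sized images (flat pixel lists) so that
    intensity 0.0 yields the neutral face and 1.0 the full expression."""
    if len(neutral) != len(emotional):
        raise ValueError("images must have the same number of pixels")
    return [(1 - intensity) * n + intensity * e
            for n, e in zip(neutral, emotional)]

# The four ERT intensity levels applied to toy 2-pixel "images".
neutral, emotional = [100.0, 100.0], [200.0, 150.0]
for level in (0.4, 0.6, 0.8, 1.0):
    print(level, morph(neutral, emotional, level))
```

Real morphing software also warps facial geometry between the two endpoint images; the pixel blend shown here captures only the intensity-interpolation idea.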

19.
High levels of trait hostility are associated with wide-ranging interpersonal deficits and heightened physiological responses to social stressors. These deficits may be attributable in part to individual differences in the perception of social cues. The present study evaluated the ability to recognize facial emotion among 48 high hostile (HH) and 48 low hostile (LH) smokers, and whether experimentally manipulated acute nicotine deprivation moderated relations between hostility and facial emotion recognition. A computer program presented series of pictures of faces that morphed from a neutral expression into increasing intensities of happiness, sadness, fear, or anger, and participants were asked to identify the emotion displayed as quickly as possible. Results indicated that HH smokers, relative to LH smokers, required a significantly greater intensity of emotion expression to recognize happiness. No differences were found for other emotions across HH and LH individuals, nor did nicotine deprivation moderate relations between hostility and emotion recognition. This is the first study to show that HH individuals are slower to recognize happy facial expressions and that this occurs regardless of recent tobacco abstinence. Difficulty recognizing happiness in others may affect the degree to which HH individuals are able to identify social approach signals and receive social reinforcement.

20.
The authors sought to contribute to the literature on the ability to recognize anger, happiness, fear, surprise, sadness, disgust, and neutral expressions from facial information, and to investigate whether this recognition pattern changes with age. More specifically, the present study compared the performance of adults and 6- to 7-year-old children in detecting emotions from the whole face and from specific face regions, namely the eyes and mouth. The findings indicate that, for both groups, recognizing disgust, happiness, and surprise is easier when pictures represent the whole face. With regard to the specific regions, no advantage of the eyes over the mouth was found for children, whereas adults appeared to rely more on the eye region. Finally, regarding group differences in emotion recognition performance, adults were better in only a few cases, whereas children were better at recognizing anger from the mouth.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)