Similar literature
20 similar records found
1.
Of the neurobiological models of depression in children and adolescents, the neuropsychological model is considered here. Experimental and clinical evidence points to a lateralization of emotional functions from the very beginning of development, and right-hemisphere dominance for emotion is by now well established. Many studies have also linked depression to right-hemisphere dysfunction in patients of different ages. The aim of our study was to analyze recognition of different facial emotions by a group of depressed children and adolescents. Compared with healthy controls, patients with Major Depressive Disorder recognized fear less accurately among the six fundamental emotions, and Dysthymic subjects recognized anger less accurately. The patients' failure to recognize negatively aroused facial expressions could indicate a subtle right-hemisphere dysfunction in depressed children and adolescents.

2.
Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study evaluated mechanisms underlying RFRs during childhood and examined possible impairment in children with ASD. Experiment 1 found RFRs to happy and angry faces (not fear faces) in 15 typically developing children from 7 to 12 years of age. RFRs of fear (not anger) in response to angry faces indicated an emotional mechanism. In 11 children (8-13 years of age) with ASD, Experiment 2 found undifferentiated RFRs to fear expressions and no consistent RFRs to happy or angry faces. However, as children with ASD aged, matching RFRs to happy faces increased significantly, suggesting the development of processes underlying matching RFRs during this period in ASD.

3.
The development of children's ability to identify facial emotional expressions has long been suggested to be experience dependent, with parental caregiving as an important influencing factor. This study attempts to further this knowledge by examining disorganization of the attachment system as a potential psychological mechanism behind aberrant caregiving experiences and deviations in the ability to identify facial emotional expressions. Typically developing children (N = 105, 49.5% boys) aged 6–7 years (M = 6 years 8 months, SD = 1.8 months) completed an attachment representation task and an emotion identification task, and parents rated children's negative emotionality. The results showed a generally diminished ability in disorganized children to identify facial emotional expressions, but no response biases. Disorganized attachment was also related to higher levels of negative emotionality, but discrimination of emotional expressions did not moderate or mediate this relation. Our findings relate disorganized attachment to deviations in emotion identification, suggesting that disorganization of the attachment system may constitute a psychological mechanism linking aberrant caregiving experiences to deviations in children's ability to identify facial emotional expressions. They further suggest that, in the absence of maltreatment, these deviations may manifest as a generally diminished ability to identify emotional expressions rather than as specific response biases.

4.
The present study investigated whether dysphoric individuals have difficulty disengaging attention from negative stimuli and/or show reduced attention to positive information. Sad, neutral, and happy facial stimuli were presented in an attention-shifting task to 18 dysphoric and 18 control participants. Reaction times to neutral shapes (squares and diamonds) and event-related potentials to emotional faces were recorded. Dysphoric individuals did not show impaired attentional disengagement from sad faces or facilitated disengagement from happy faces. Right occipital lateralisation of the P100 was absent in dysphoric individuals, possibly indicating reduced attention-related sensory facilitation for faces. Frontal P200 was largest for sad faces among dysphoric individuals, whereas controls showed larger amplitudes to both sad and happy faces compared with neutral expressions, suggesting that dysphoric individuals deployed early attention to sad, but not happy, expressions. Importantly, the results were obtained while controlling for participants' trait anxiety. We conclude that, at least under some circumstances, the presence of depressive symptoms can modulate early, automatic stages of emotional processing.

5.
The present study explored the influence of facial emotional expressions on preschoolers' identity recognition, using a two-alternative forced-choice matching task. A decrement was observed in children's performance with emotional faces compared with neutral faces, both when a happy emotional expression remained unchanged between the target face and the test faces and when the expression changed from happy to neutral or from neutral to happy between the target and the test faces (Experiment 1). Negative emotional expressions (i.e., fear and anger) also interfered with children's identity recognition (Experiment 2). The evidence obtained suggests that in preschool-age children, facial emotional expressions are processed in interaction with, rather than independently of, the encoding of facial identity information. The results are discussed in relation to relevant research conducted with adults and children.

6.
Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and testing preschoolers' discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum "felt the same" or "felt different." In the identification task, images were presented individually and participants were asked to label the emotion displayed on the face (e.g., "Does she look happy or sad?"). Results suggest that 3.5-year-olds have the same category boundary as adults: they were more likely to report that image pairs felt "different" when the pair crossed the category boundary. These results suggest that 3.5-year-olds perceive happy and sad emotional facial expressions categorically, as adults do. Categorizing emotional expressions is advantageous for children if it allows them to use social information faster and more efficiently.
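In identification data like these, the category boundary is conventionally taken as the morph level at which the proportion of "happy" responses crosses 50%. A minimal sketch of that estimate, using invented identification proportions and simple linear interpolation (the study itself does not describe its computation; this is only an illustration):

```python
def category_boundary(levels, p_happy):
    """Estimate the morph level where "happy" identification crosses 50%.

    levels: morph steps from fully happy (0) to fully sad (e.g., 100)
    p_happy: proportion of "happy" responses at each step
    """
    pairs = list(zip(levels, p_happy))
    for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
        if p0 >= 0.5 > p1:  # crossing from "happy" to "sad"
            # linear interpolation between the two straddling steps
            return x0 + (p0 - 0.5) / (p0 - p1) * (x1 - x0)
    return None  # no crossing found

# Hypothetical identification data along a happy-to-sad morph continuum
levels = [0, 20, 40, 60, 80, 100]
p_happy = [0.98, 0.95, 0.80, 0.30, 0.08, 0.02]
print(category_boundary(levels, p_happy))  # ≈ 52.0, between the 40 and 60 steps
```

The sharp drop between adjacent morph steps, rather than a gradual decline, is what marks perception as categorical.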

7.
The current longitudinal study (N = 107) examined mothers' facial emotion recognition using reaction time, and their infants' affect-biased attention at 5, 7, and 14 months of age using eye tracking. Our results, examining maternal and infant responses to angry, fearful, and happy facial expressions, show that only maternal responses to angry facial expressions were robustly and positively linked across time points, indexing a consistent, trait-like response to social threat among mothers. Neither maternal responses to happy or fearful facial expressions nor infant responses to any of the three facial emotions showed such consistency, pointing to the changeable nature of facial emotion processing, especially among infants. In general, infants' attention toward negative emotions (i.e., anger and fear) at earlier time points was linked to their affect-biased attention for these emotions at 14 months but showed greater dynamic change across time. Moreover, our results provide limited evidence for developmental continuity in processing negative emotions and for a bidirectional interplay of infant affect-biased attention and maternal facial emotion recognition. This pattern of findings suggests that infants' affect-biased attention to facial expressions of emotion is characterized by dynamic change.

8.
A small body of research suggests that socially anxious individuals show biases in interpreting the facial expressions of others. The current study included a clinically anxious sample in a speeded emotional card-sorting task under two conditions (baseline and threat) to investigate several hypothesized interpretation biases. Following the threat manipulation, participants with generalized social anxiety disorder (GSAD) sorted angry cards with greater accuracy, but also misclassified a greater proportion of neutral cards as angry, compared with nonanxious controls. The controls showed the opposite pattern, sorting neutral cards with greater accuracy but misclassifying a greater proportion of angry cards as neutral, compared with the GSAD group. These effects were driven primarily by low-intensity angry cards. The results are consistent with previous studies showing a negative interpretive bias and can be applied to the improvement of clinical interventions.
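One conventional way to summarize the pattern described above (higher accuracy for angry cards alongside more neutral-as-angry misclassifications) is signal detection analysis, which separates sensitivity (d′) from response bias (the criterion c). This is an illustrative sketch, not the authors' analysis, and the counts are invented:

```python
from statistics import NormalDist

def dprime_and_bias(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and criterion (c) from a 2x2 classification table.

    A +0.5 correction guards against undefined z-scores at 0% or 100% rates.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2  # negative c = bias toward "angry"
    return d_prime, criterion

# Invented counts: angry cards correctly sorted vs. neutral cards misread as angry
d, c = dprime_and_bias(hits=45, misses=5, false_alarms=15, correct_rejections=35)
print(d, c)
```

Under this framing, a liberal criterion (negative c) for "angry" would capture the bias in the anxious group even when raw accuracy for angry cards is high.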

9.
Two studies investigated the importance of dynamic temporal information in facilitating the recognition of subtle expressions of emotion. Experiment 1 had three conditions: dynamic moving sequences showing the expression emerging from neutral to a subtle emotion, a dynamic presentation containing nine static stills from the moving sequences (run together to approximate a moving sequence), and a First–Last condition containing only the first (neutral) and last (subtle emotion) stills. Recognition was significantly better for the dynamic moving sequences than for both the Dynamic-9 and First–Last conditions. Experiments 2a and 2b then altered the dynamics of the moving sequences by speeding up, slowing down, or disrupting the rhythm of the motion. These manipulations significantly reduced recognition, and it was concluded that, in addition to the perception of change, recognition is facilitated by the characteristic muscular movements associated with the portrayal of each emotion.

10.
The object of the present study was to investigate whether 20 educable mentally retarded (EMR) children, matched for verbal mental age on the Peabody Picture Vocabulary Test (PPVT; Dunn, 1965) with 20 nonretarded (NR) controls, could identify emotional facial expressions and produce the equivalent word adjectives. I also reassessed the relationship between the phase of identifying facial expressions (i.e., happy, sad, angry, and scared) and the phase of producing emotional word adjectives, comprising 16 emotional linguistic constructions (4 short stories for each adjective), both between EMR children higher in verbal ability (HEMR) and nonretarded controls higher in verbal ability (HNRC), and between EMR children lower in verbal ability (LEMR) and nonretarded controls lower in verbal ability (LNRC). There were no significant differences between EMR and NR children in general, despite the fact that EMR children had deficits in receptive linguistic competence.

11.
12.
Recognition of emotional facial expressions is a central topic in the psychology of emotion. This study presents two experiments. The first analyzed recognition accuracy for the basic emotions of happiness, anger, fear, sadness, surprise, and disgust: 30 pictures (5 per emotion) were displayed to 96 participants, and recognition accuracy varied significantly across emotions. The second experiment analyzed the effect of contextual information on recognition accuracy: information either congruent or incongruent with a facial expression was displayed before each picture. Congruent information improved facial expression recognition, whereas incongruent information impaired it.

13.
The Approach–Avoidance Task (AAT) was employed to indirectly investigate avoidance reactions to stimuli of potential social threat. Forty-three highly socially anxious individuals (HSAs) and 43 non-anxious controls (NACs) reacted to pictures of emotional facial expressions (angry, neutral, or smiling) or to control pictures (puzzles) by pulling a joystick towards themselves (approach) versus pushing it away from themselves (avoidance). HSAs showed stronger avoidance tendencies than NACs for smiling as well as angry faces, whereas no group differences were found for neutral faces and puzzles. In contrast, valence ratings of the emotional facial expressions did not differ between groups. A critical discrepancy between direct and indirect measures was observed for smiling faces: HSAs evaluated them positively, but reacted to them with avoidance.

14.
In this paper, we investigate the role of self-reported anxiety and degree of conscious awareness as determinants of the selective processing of affective facial expressions. In two experiments, an attentional bias toward fearful facial expressions was observed, although this bias was apparent only for those reporting high levels of trait anxiety and only when the emotional face was presented in the left visual field. This pattern was especially strong when the participants were unaware of the presence of the facial stimuli. In Experiment 3, a patient with right-hemisphere brain damage and visual extinction was presented with photographs of faces and fruits on unilateral and bilateral trials. On bilateral trials, faces produced less extinction than did fruits. Moreover, faces portraying a fearful or a happy expression tended to produce less extinction than did neutral expressions. This suggests that emotional facial expressions may be less dependent on attention to achieve awareness. The implications of these results for understanding the relations between attention, emotion, and anxiety are discussed.

15.
A rapid response to a threatening face in a crowd is important for successful interaction in social environments. Visual search tasks have been employed to determine whether there is a processing advantage for detecting an angry face in a crowd compared with a happy face. The empirical findings supporting the "anger superiority effect" (ASE), however, have been criticized on the basis of possible low-level visual confounds and the limited ecological validity of the stimuli. Moreover, a "happiness superiority effect" is usually found with more realistic stimuli. In the present study, we tested the ASE using dynamic (and static) images of realistic human faces with validated emotional expressions of similar intensity, after controlling for bottom-up visual saliency and the amount of image motion. In five experiments, we found strong evidence for an ASE when using dynamic displays of facial expressions, but not when the emotions were expressed by static face images.

16.
Ribeiro, L. A. & Fearon, P. (2010). Theory of mind and attentional bias to facial emotional expressions: A preliminary study. Scandinavian Journal of Psychology. Theory of mind ability has been associated with performance in interpersonal interactions and has been found to influence aspects such as emotion recognition, social competence, and social anxiety. Being able to attribute mental states to others requires attention to subtle communication cues such as facial emotional expressions. Decoding and interpreting emotions expressed by the face, especially those with negative valence, are essential skills for successful social interaction. The current study explored the association between theory of mind skills and attentional bias to facial emotional expressions. Consistent with the study hypothesis, individuals with poor theory of mind skills showed preferential attention to negative faces over both non-negative faces and neutral objects. Tentative explanations for the findings are offered, emphasizing the potentially adaptive role of vigilance for threat as a way of allocating a limited capacity for interpreting others' mental states so as to obtain as much information as possible about potential danger in the social environment.

17.
The recognition of nonverbal emotional signals and the integration of multimodal emotional information are essential for successful social communication among humans of any age. Whereas prior studies of age dependency in emotion recognition often focused on either the prosodic or the facial aspect of nonverbal signals, our purpose was to create a more naturalistic setting by presenting dynamic stimuli under three experimental conditions: auditory, visual, and audiovisual. Eighty-four healthy participants (44 women, 40 men; age range 20-70 years) were tested for their ability to recognize emotions either mono- or bimodally on the basis of emotional (happy, alluring, angry, disgusted) and neutral nonverbal stimuli from voice and face. Additionally, we assessed visual and auditory acuity, working memory, verbal intelligence, and emotional intelligence to explore potential explanatory effects of these population parameters on the relationship between age and emotion recognition. Applying unbiased hit rates as the performance measure, we analyzed the data with linear regression analyses, t tests, and mediation analyses. We found a linear, age-related decrease in emotion recognition independent of stimulus modality and emotional category. In contrast, the improvement in recognition rates associated with audiovisual integration of bimodal stimuli seems to be maintained over the life span. The reduction in emotion recognition ability at an older age could not be sufficiently explained by age-related decreases in hearing, vision, working memory, or verbal intelligence. These findings suggest alterations in social perception at a level of complexity beyond basic perceptual and cognitive abilities.
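The unbiased hit rate (Hu; Wagner, 1993) used as the performance measure here corrects raw accuracy for how often each response category is used: for each emotion it is the squared count of correct identifications divided by the product of the stimulus count and the response count for that category. A minimal sketch with an invented confusion matrix (the study's own data are not reproduced):

```python
def unbiased_hit_rates(confusion, labels):
    """Wagner's (1993) unbiased hit rate per stimulus category.

    confusion[i][j] = number of times stimulus i received response j.
    Hu_i = correct_i**2 / (stimulus_total_i * response_total_i)
    """
    n = len(labels)
    col_totals = [sum(confusion[i][j] for i in range(n)) for j in range(n)]
    hu = {}
    for i, label in enumerate(labels):
        row_total = sum(confusion[i])
        correct = confusion[i][i]
        # An arcsine transform is often applied before statistics; omitted here.
        hu[label] = (correct ** 2 / (row_total * col_totals[i])
                     if row_total and col_totals[i] else 0.0)
    return hu

# Invented confusion matrix: rows = presented emotion, columns = response given
labels = ["happy", "angry", "neutral"]
confusion = [
    [18, 1, 1],   # happy stimuli
    [2, 14, 4],   # angry stimuli
    [3, 2, 15],   # neutral stimuli
]
print(unbiased_hit_rates(confusion, labels))
```

Unlike raw accuracy, Hu penalizes a participant who inflates hits for one emotion simply by over-using that response label.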

18.
Previous research has demonstrated that depression is associated with dysfunctional attentional processing of emotional information. Most studies examined this bias by recording response latencies. The present study employed a more ecologically valid measure of attentive processing: eye-movement registration. Dysphoric and non-dysphoric participants viewed slides presenting sad, angry, happy, and neutral facial expressions. For each type of expression, three components of visual attention were analysed: relative fixation frequency, fixation time, and glance duration. Attentional biases were also investigated for inverted facial expressions to ensure that they were not driven by eye-catching facial features. Results indicated that non-dysphoric individuals were characterised by longer fixation and dwell times on happy faces. Dysphoric individuals demonstrated longer dwelling on sad and neutral faces. These results were not found for inverted facial expressions. The present findings are in line with the assumption that depression is associated with prolonged attentional elaboration of negative information.
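The three attention components named above can be derived from a fixation log by aggregating over areas of interest (AOIs): relative fixation frequency (share of fixations landing on a face type), relative fixation time (share of summed fixation durations), and glance duration (consecutive fixations on the same region counted as one glance). A simplified sketch under that interpretation; the data format and field names are invented, not the study's:

```python
from collections import defaultdict
from itertools import groupby

def attention_components(fixations):
    """Aggregate a fixation log, given as a list of (aoi, duration_ms) tuples.

    Returns, per AOI: relative fixation frequency, relative fixation time,
    and mean glance duration (a glance = a run of consecutive fixations
    on the same AOI).
    """
    total_n = len(fixations)
    total_t = sum(d for _, d in fixations)
    freq, time = defaultdict(int), defaultdict(int)
    glances = defaultdict(list)
    for aoi, dur in fixations:
        freq[aoi] += 1
        time[aoi] += dur
    for aoi, run in groupby(fixations, key=lambda f: f[0]):
        glances[aoi].append(sum(d for _, d in run))
    return {
        aoi: {
            "rel_fix_freq": freq[aoi] / total_n,
            "rel_fix_time": time[aoi] / total_t,
            "mean_glance_ms": sum(glances[aoi]) / len(glances[aoi]),
        }
        for aoi in freq
    }

# Invented fixation sequence: (expression looked at, fixation duration in ms)
log = [("sad", 200), ("sad", 300), ("happy", 250), ("sad", 150), ("neutral", 100)]
print(attention_components(log))
```

Separating frequency from time matters for the reported finding: a bias can appear as more returns to sad faces, longer individual glances on them, or both.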

19.
This study identified components of attentional bias (e.g. attentional vigilance, attentional avoidance and difficulty with disengagement) that are critical characteristics of survivors of dating violence (DV). Eye movements were recorded to obtain accurate and continuous information regarding attention. DV survivors with high post-traumatic stress symptoms (DV-High PTSS group; n = 20) and low post-traumatic stress symptoms (DV-Low PTSS group; n = 22) and participants who had never experienced DV (NDV group; n = 21) were shown screens displaying emotional (angry, fearful and happy) faces paired with neutral faces and negative (angry and fearful) faces paired with happy faces for 10 s. The results indicate that the DV-High PTSS group spent longer dwelling on angry faces over time compared with the DV-Low PTSS and NDV groups. This result implies that the DV-High PTSS group focused on specific trauma-related stimuli but does not provide evidence of an attentional bias towards threatening stimuli in general.

20.
Previous studies (Lanzetta & Orr, 1980, 1981; Orr & Lanzetta, 1980) have demonstrated that fear facial expressions have the functional properties of conditioned excitatory stimuli, while happy expressions behave as conditioned inhibitors of emotional responses. The present study uses a summation conditioning procedure to distinguish between associative and nonassociative (selective sensitization, attentional) interpretations of these findings. A neutral tone was first established as a conditioned excitatory CS by reinforcing tone presentations with shock. In subsequent nonreinforced test trials the excitatory tone was paired with either fear, happy, or neutral facial expressions. A tone alone and a tone/nonface slide compound were used as controls. The results indicate that phasic and tonic skin conductance responses to the tone/fear-expression compound were significantly larger during extinction than for all other experimental and control groups. No significant differences were found among these latter conditions. The findings support the assumption that the excitatory characteristics of fear expressions do not depend on associative mechanisms. In the presence of fear cues, fear facial expressions intensify the emotional reaction and disrupt extinction of a previously acquired fear response. Happy facial expressions, however, did not function as conditioned inhibitors in the absence of reinforcement, suggesting that the previously found inhibition was associative in nature. This research was supported by NSF grant No. 77-08926 and by funds from the Lincoln Filene Endowment to Dartmouth College.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号