Similar Articles
20 similar articles retrieved.
1.
In this paper, the role of self-reported anxiety and degree of conscious awareness as determinants of the selective processing of affective facial expressions is investigated. In two experiments, an attentional bias toward fearful facial expressions was observed, although this bias was apparent only for those reporting high levels of trait anxiety and only when the emotional face was presented in the left visual field. This pattern was especially strong when the participants were unaware of the presence of the facial stimuli. In Experiment 3, a patient with right-hemisphere brain damage and visual extinction was presented with photographs of faces and fruits on unilateral and bilateral trials. On bilateral trials, it was found that faces produced less extinction than did fruits. Moreover, faces portraying a fearful or a happy expression tended to produce less extinction than did neutral expressions. This suggests that emotional facial expressions may be less dependent on attention to achieve awareness. The implications of these results for understanding the relations between attention, emotion, and anxiety are discussed.

2.
Inferences about emotions in children are limited by studies that rely on only one research method. Convergence across methods provides a stronger basis for inference by identifying method variance. This multimethod study of 116 children (mean age = 8.21 years) examined emotional displays during social exchange. Each child received a desirable gift and later an undesirable gift after performing tasks, with or without mother present. Children's reactions were observed and coded. Children displayed more positive affect with mother present than with mother absent. Independent ratings of children by adults revealed that children lower in the personality dimension of Agreeableness displayed more negative emotion than their peers following the receipt of an undesirable gift. A curvilinear interaction between Agreeableness and mother condition predicted negative affect displays. Emotional assessment is discussed in terms of links to social exchange and the development of expressive behavior.

3.
A rapid response to a threatening face in a crowd is important to successfully interact in social environments. Visual search tasks have been employed to determine whether there is a processing advantage for detecting an angry face in a crowd, compared to a happy face. The empirical findings supporting the "anger superiority effect" (ASE), however, have been criticized on the basis of possible low-level visual confounds and because of the limited ecological validity of the stimuli. Moreover, a "happiness superiority effect" is usually found with more realistic stimuli. In the present study, we tested the ASE by using dynamic (and static) images of realistic human faces with validated emotional expressions of similar intensity, after controlling for bottom-up visual saliency and the amount of image motion. In five experiments, we found strong evidence for an ASE when using dynamic displays of facial expressions, but not when the emotions were expressed by static face images.

4.
Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and testing preschoolers' discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum "felt the same" or "felt different." In the identification task, images were presented individually and participants were asked to label the emotion displayed on the face (e.g., "Does she look happy or sad?"). The results suggest that 3.5-year-olds have the same category boundary as adults: they were more likely to report that an image pair felt "different" when the pair crossed the category boundary. These results suggest that 3.5-year-olds perceive happy and sad emotional facial expressions categorically, as adults do. Categorizing emotional expressions is advantageous for children if it allows them to use social information faster and more efficiently.

5.
People who explain why ambiguous faces are expressing anger perceive and remember those faces as angrier than do people who explain why the same faces are expressing sadness. This phenomenon may be explained by a two-stage process in which language decomposes a facial configuration into its component features, which are then reintegrated with emotion categories available in the emotion explanation itself. This configural-decomposition hypothesis is consistent with experimental results showing that the explanation effect is attenuated when configural face processing is impaired (e.g., when the faces are inverted). Ironically, although people explain emotional expressions to make more accurate attributions, the process of explanation itself can decrease accuracy by leading to perceptual assimilation of the expressions to the emotions being explained.

6.
From birth, infants are exposed to a wealth of emotional information in their interactions. Much research has investigated the development of emotion perception and the factors influencing that development. The current study investigates the role of familiarity in 3.5-month-old infants' generalization of emotional expressions. Infants were assigned to one of two habituation sequences. In one sequence, infants were visually habituated to parental expressions of happiness or sadness. At test, infants viewed either a continuation of the habituation sequence, their mother depicting a novel expression, an unfamiliar female depicting the habituated expression, or an unfamiliar female depicting a novel expression. In the second sequence, a new sample of infants was matched to the infants in the first sequence; these infants viewed the same habituation and test sequences, but the actors were unfamiliar to them. Only those infants who viewed their own mothers and fathers during the habituation sequence showed increased looking at test. They dishabituated to the mother's novel expression, the unfamiliar female's novel expression, and the unfamiliar female depicting the habituated expression, especially when sad parental expressions were followed by a change to a happy expression or by a change in person. Infants are guided in their recognition of emotional expressions by the familiarity of their parents before generalizing to others.

7.
Robyn Fivush, Sex Roles, 1989, 20(11-12): 675-691
In this study, the ways in which mothers and their 30–35-month-old children discussed the emotional aspects of past experiences were explored. Although previous research has established that children this age talk about emotions, and some studies have found sex differences between mother-daughter and mother-son dyads in these conversations, no study has explicitly examined the way in which emotions about the past are discussed. This is an important research question because emotional aspects of events may help provide an evaluative framework for thinking and talking about the past. The results suggest that, with daughters, mothers focus more on positive emotions and tend not to attribute negative emotions to the child. With sons, positive and negative emotions are discussed equally. Moreover, mothers never discuss anger with their daughters, but they do with their sons. Finally, mother-daughter conversations emphasize the emotional state itself, whereas mother-son conversations often address the causes and consequences of emotions. The ways in which these patterns might contribute to children's developing understanding of gender-appropriate emotional reactions are discussed.

8.
There is evidence that specific regions of the face, such as the eyes, are particularly relevant for the decoding of emotional expressions, but it has not been examined whether observers' scan paths vary for facial expressions with different emotional content. In this study, eye-tracking was used to monitor the scanning behavior of healthy participants while they looked at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. For sad facial expressions in particular, participants directed their initial fixation to the eyes more frequently than for any other expression. In happy facial expressions, participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that the eyes and mouth were equally important, whereas in sad and angry facial expressions, the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that facial expressions with different emotional content are not all decoded in the same way. Our data suggest that people look at the regions that are most characteristic of each emotion.
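The abstract does not spell out how the dominance ratio was computed. The sketch below shows one plausible way to derive such a ratio from area-of-interest (AOI) coded fixation data, assuming every fixation is labelled as "eyes", "mouth", or "other"; the AOI coding, the example trial, and the exact definition (eyes-plus-mouth dwell time relative to the rest of the face) are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

def dominance_ratio(fixations):
    """Eyes-plus-mouth vs. rest-of-face dominance ratio for one trial.

    `fixations` is a list of (aoi_label, duration_ms) tuples, where
    aoi_label is one of "eyes", "mouth", or "other" (an assumed coding).
    Returns total dwell time on the eyes and mouth divided by the dwell
    time on the rest of the face.
    """
    dwell = defaultdict(float)
    for aoi, duration in fixations:
        dwell[aoi] += duration

    core = dwell["eyes"] + dwell["mouth"]
    rest = dwell["other"]
    if rest == 0:
        return float("inf")  # every fixation fell on the eyes or mouth
    return core / rest

# Hypothetical fixation record for a single trial
trial = [("eyes", 310), ("mouth", 180), ("other", 95), ("eyes", 240)]
print(dominance_ratio(trial))  # (310 + 180 + 240) / 95 ≈ 7.7
```

A ratio above 1 would indicate that the eyes and mouth together attracted more dwell time than the rest of the face; averaging the ratio per emotion condition is what allows the comparisons across expressions described above.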

9.
10.
Previous research has demonstrated that depression is associated with dysfunctional attentional processing of emotional information. Most studies have examined this bias by registering response latencies. The present study employed an ecologically valid measure of attentional processing, using eye-movement registration. Dysphoric and non-dysphoric participants viewed slides presenting sad, angry, happy, and neutral facial expressions. For each type of expression, three components of visual attention were analysed: the relative fixation frequency, the fixation time, and the glance duration. Attentional biases were also investigated for inverted facial expressions to ensure that they were not related to eye-catching facial features. Results indicated that non-dysphoric individuals were characterised by longer fixation and dwell times on happy faces, whereas dysphoric individuals showed longer dwell times on sad and neutral faces. These results were not found for inverted facial expressions. The present findings are in line with the assumption that depression is associated with prolonged attentional elaboration on negative information.

11.
The present study explored the influence of facial emotional expressions on preschoolers' identity recognition, using a two-alternative forced-choice matching task. A decrement was observed in children's performance with emotional faces compared with neutral faces, both when a happy emotional expression remained unchanged between the target face and the test faces and when the expression changed from happy to neutral or from neutral to happy between the target and the test faces (Experiment 1). Negative emotional expressions (i.e. fear and anger) also interfered with children's identity recognition (Experiment 2). The evidence obtained suggests that in preschool-age children, facial emotional expressions are processed in interaction with, rather than independently from, the encoding of facial identity information. The results are discussed in relation to relevant research conducted with adults and children.

12.
Sleep disturbances of infancy (difficulty falling asleep, short sleep duration, night waking) are now the subject of renewed interest. Recently, these sleep problems in babies and young children seem to have increased in frequency and severity. A pilot study of sleep disorders is underway at the child department of the Hopital de la Poterne des Peupliers and involves more than 80 children aged 0–18 months. Twenty-nine infants with severe and persistent sleep disorders that began in the first 18 months of life were selected from this cohort. In the absence of a control group, all cases were selected carefully. Some of the characteristics of these children are discussed: age, sex, associated problems, soothing strategies, the effect of hypnotic drugs, and development under treatment. Some of these children are noted to be very active. Quite often, the pregnancy was marked by a traumatic event or by depression or anxiety in the mother. During evaluation of the mother-child interaction, one often notes a withdrawal of the mother's cathexis toward her child, which can go along with hyperstimulation in the relationship. This withdrawal concerns the emotional and fantasied dimensions of the interaction. The child's reactions and initiatives can be paralleled with the mother's verbal, motor, and emotional attitudes. Thus, sleep disorders in early childhood represent an opportunity to observe the modalities of emotional exchange between mother and child, and a methodologically interesting situation for the assessment of mother-child relationships.

13.
Comparison of behavioural measures of consciousness has attracted much attention recently. In a recent article, Szczepanowski et al. conclude that confidence ratings (CR) predict accuracy better than both the perceptual awareness scale (PAS) and post-decision wagering (PDW) when using stimuli with emotional content (fearful vs. neutral faces). Although we find the study interesting, we disagree with the conclusion that CR is superior to PAS because of two methodological issues. First, the conclusion is not based on a formal test. We performed this test and found no evidence that CR predicted accuracy better than PAS (p = .4). Second, Szczepanowski et al. used the present version of PAS in a manner somewhat different from how it was originally intended, and the participants may not have been adequately instructed. We end our commentary with a set of recommendations for future studies using PAS.
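The commentary refers to a formal test of whether CR predicts accuracy better than PAS, but the abstract does not name the procedure. The sketch below illustrates one common, assumed approach: quantify each scale's predictive power as the rank-based AUC for trial accuracy and bootstrap the CR-minus-PAS difference. The variable names, the AUC measure, and the bootstrap scheme are assumptions for illustration, not necessarily the authors' test.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def bootstrap_auc_difference(cr, pas, correct, n_boot=2000, seed=0):
    """Bootstrap test of whether confidence ratings (CR) predict trial
    accuracy better than PAS ratings.

    cr, pas : per-trial ratings on each scale (equal-length arrays)
    correct : per-trial accuracy (1 = correct, 0 = incorrect)
    Returns the mean AUC difference (CR - PAS) and a two-sided
    bootstrap p-value for the null hypothesis of no difference.
    """
    cr, pas, correct = map(np.asarray, (cr, pas, correct))
    rng = np.random.default_rng(seed)
    n = len(correct)
    diffs = []
    while len(diffs) < n_boot:
        idx = rng.integers(0, n, n)
        if correct[idx].min() == correct[idx].max():
            continue  # resample has only correct or only incorrect trials
        diffs.append(roc_auc_score(correct[idx], cr[idx])
                     - roc_auc_score(correct[idx], pas[idx]))
    diffs = np.array(diffs)
    p_two_sided = 2 * min((diffs <= 0).mean(), (diffs >= 0).mean())
    return diffs.mean(), p_two_sided
```

A hierarchical version that resamples within participants, or a mixed-effects logistic regression with a scale-by-rating term, would be a more typical published analysis, but the comparison logic is the same.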

14.
We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness-fear and anger-disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness-fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise-sadness and excitement-disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.
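The abstract does not describe the morphing procedure itself. The following is a minimal sketch of the simplest variant, a pixel-wise cross-dissolve between two aligned endpoint photographs (e.g., a happy and a fearful face); published morphing studies typically also warp facial landmarks so that features stay aligned, and that step, along with the file names and number of steps, is an assumption of this illustration.

```python
import numpy as np
from PIL import Image

def morph_continuum(img_a_path, img_b_path, n_steps=9):
    """Generate a continuum of images between two aligned endpoint faces
    by pixel-wise linear interpolation (a cross-dissolve).

    Note: real face-morphing software usually also warps facial landmarks;
    this sketch omits that step for brevity.
    """
    a = np.asarray(Image.open(img_a_path).convert("RGB"), dtype=float)
    b = np.asarray(Image.open(img_b_path).convert("RGB"), dtype=float)
    assert a.shape == b.shape, "endpoint images must be the same size"

    frames = []
    for w in np.linspace(0.0, 1.0, n_steps):
        blend = (1.0 - w) * a + w * b          # 0% -> 100% of image B
        frames.append(Image.fromarray(blend.astype(np.uint8)))
    return frames

# Hypothetical usage: a 9-step happiness-to-fear continuum
# for i, frame in enumerate(morph_continuum("happy.png", "fear.png")):
#     frame.save(f"morph_{i}.png")
```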

15.
A method is presented for obtaining a series of silhouettes that were analyzed as profiles of the human face. When depressed psychiatric patients smiled before and after electroshock therapy, a greater facial displacement was recorded after treatment. Controls did not show any trend in this regard.

16.
17.
Posers were requested to produce happy and sad emotional expressions, deliberately accentuated on the left and right sides of the face. Raters judged the emotional intensity of expressions when presented in original and mirror-reverse orientation. Left-side-accentuated sad expressions were rated as more intense than right-side-accentuated sad expressions. Raters were biased to judge expressions as more intense when the accentuated side was to their left. The findings indicated that the perceiver bias in weighting information from the side of the face in left hemispace extends to judgments of emotional intensity.

18.
The recognition of nonverbal emotional signals and the integration of multimodal emotional information are essential for successful social communication among humans of any age. Whereas prior studies of age dependency in emotion recognition often focused on either the prosodic or the facial aspect of nonverbal signals, our purpose was to create a more naturalistic setting by presenting dynamic stimuli under three experimental conditions: auditory, visual, and audiovisual. Eighty-four healthy participants (44 women, 40 men; age range 20-70 years) were tested on their ability to recognize emotions either mono- or bimodally on the basis of emotional (happy, alluring, angry, disgusted) and neutral nonverbal stimuli from voice and face. Additionally, we assessed visual and auditory acuity, working memory, verbal intelligence, and emotional intelligence to explore potential explanatory effects of these population parameters on the relationship between age and emotion recognition. Applying unbiased hit rates as the performance measure, we analyzed the data with linear regression analyses, t tests, and mediation analyses. We found a linear, age-related decrease in emotion recognition, independent of stimulus modality and emotional category. In contrast, the improvement in recognition rates associated with audiovisual integration of bimodal stimuli seems to be maintained over the life span. The reduction in emotion recognition ability at an older age could not be sufficiently explained by age-related decreases in hearing, vision, working memory, or verbal intelligence. These findings suggest alterations in social perception at a level of complexity beyond basic perceptual and cognitive abilities.
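"Unbiased hit rates" presumably refers to Wagner's (1993) measure, which corrects raw hit rates for how often each response category is used. Below is a minimal sketch under that assumption; the example confusion matrix values are invented for illustration.

```python
import numpy as np

def unbiased_hit_rates(confusion):
    """Wagner's (1993) unbiased hit rate for each emotion category.

    `confusion` is a square stimulus x response confusion matrix:
    confusion[i, j] = number of times stimulus category i received
    response j. For each category i,
        Hu_i = hits_i**2 / (stimuli_i * responses_i),
    i.e., the squared number of correct responses divided by the product
    of how many stimuli of that category were shown and how often that
    response was given overall.
    """
    confusion = np.asarray(confusion, dtype=float)
    hits = np.diag(confusion)
    stimulus_totals = confusion.sum(axis=1)   # row sums
    response_totals = confusion.sum(axis=0)   # column sums
    return hits ** 2 / (stimulus_totals * response_totals)

# Hypothetical confusion matrix for three categories (happy, angry, neutral)
example = [[18,  1,  1],
           [ 2, 15,  3],
           [ 3,  2, 15]]
print(unbiased_hit_rates(example))  # e.g., ~0.70, 0.63, 0.59
```

Because Hu multiplies the hit rate given the stimulus by the hit rate given the response, a participant who overuses one response label is penalized for the resulting false alarms rather than credited with a perfect hit rate for that category.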

19.
Of the neurobiological models of children's and adolescents' depression, the neuropsychological one is considered here. Experimental and clinical evidence has allowed a lateralization of emotional functions to be identified from the very beginning of development, and a right-hemisphere dominance for emotions is by now well known. Many studies have also correlated depression with a right-hemisphere dysfunction in patients of different ages. The aim of our study was to analyze the recognition of different facial emotions by a group of depressed children and adolescents. Patients with Major Depressive Disorder recognized fear less accurately among the six fundamental emotions than a group of healthy controls did, and dysthymic subjects recognized anger less accurately. The patients' failure to recognize negative, arousing facial expressions could indicate a subtle right-hemisphere dysfunction in depressed children and adolescents.

20.
Facial expressions of emotion are nonverbal behaviors that allow us to interact efficiently in social life and respond to events affecting our welfare. This article reviews 21 studies, published between 1932 and 2015, examining the production of facial expressions of emotion by blind people. It particularly discusses the impact of visual experience on the development of this behavior from birth to adulthood. After a discussion of three methodological considerations, the review of studies reveals that blind subjects demonstrate differing capacities for producing spontaneous expressions and voluntarily posed expressions. Seventeen studies provided evidence that blind and sighted individuals spontaneously produce the same pattern of facial expressions, even if some variations can be found, reflecting facial and body movements specific to blindness or differences in the intensity and control of emotions in some specific contexts. This suggests that a lack of visual experience does not have a major impact when this behavior is generated spontaneously in real emotional contexts. In contrast, eight studies examining voluntary expressions indicate that blind individuals have difficulty posing emotional expressions; the opportunity for prior visual observation seems to affect performance in this case. Finally, we discuss three new directions for research that could provide additional, strong evidence for the debate regarding whether the production of emotional facial expressions by blind individuals is innate or learned in a culture-constant way: the link between the perception and production of facial expressions, the impact of display rules in the absence of vision, and the role of other channels in the expression of emotions in the context of blindness.
