Similar Literature
20 similar documents found (search time: 28 ms)
1.
Adults perceive emotional expressions categorically, with discrimination being faster and more accurate between expressions from different emotion categories (i.e. blends with two different predominant emotions) than between two stimuli from the same category (i.e. blends with the same predominant emotion). The current study sought to test whether facial expressions of happiness and fear are perceived categorically by pre-verbal infants, using a new stimulus set that was shown to yield categorical perception in adult observers (Experiments 1 and 2). These stimuli were then used with 7-month-old infants (N = 34) using a habituation and visual preference paradigm (Experiment 3). Infants were first habituated to an expression of one emotion, then presented with the same expression paired with a novel expression either from the same emotion category or from a different emotion category. After habituation to fear, infants displayed a novelty preference for pairs of between-category expressions, but not within-category ones, showing categorical perception. However, infants showed no novelty preference when they were habituated to happiness. Our findings provide evidence for categorical perception of emotional expressions in pre-verbal infants, while the asymmetrical effect challenges the notion of a bias towards negative information in this age group.

2.
We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness–fear and anger–disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness–fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise–sadness and excitement–disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.

3.
The Emotion Recognition Task is a computer-generated paradigm for measuring the recognition of six basic facial emotional expressions: anger, disgust, fear, happiness, sadness, and surprise. Video clips of increasing length were presented, starting with a neutral face that changes into a facial expression of different intensities (20%-100%). The present study describes methodological aspects of the paradigm and its applicability in healthy participants (N=58; 34 men; ages between 22 and 75), specifically focusing on differences in recognition performance between the six emotion types and age-related change. The results showed that happiness was the easiest emotion to recognize, while fear was the most difficult. Moreover, older adults performed worse than young adults on anger, sadness, fear, and happiness, but not on disgust and surprise. These findings indicate that this paradigm is probably more sensitive than emotion perception tasks using static images, suggesting it is a useful tool in the assessment of subtle impairments in emotion perception.

4.
Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). FTLD patients also show impairments in emotion processing; specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which thus may have been a confounding factor in previous studies. Moreover, ceiling effects are often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at the recognition of the emotion anger. The patients also performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

5.
While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues and whether infants’ emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigate these questions using Baby-FACS to code infants’ facial displays and eye-movement tracking to examine infants’ looking times at facial expressions. Three-, 7-, and 12-month-old participants were exposed to dynamic facial expressions (joy, anger, fear, disgust, sadness) of a virtual model which either looked at the infant or had an averted gaze. Infants did not match emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model’s negative expressions and they looked more at areas of the face recruiting facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period where the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.

6.
Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.

7.
The present study investigated whether facial expressions modulate visual attention in 7-month-old infants. First, infants' looking duration to individually presented fearful, happy, and novel facial expressions was compared to looking duration to a control stimulus (scrambled face). The face with a novel expression was included to examine the hypothesis that the earlier findings of greater allocation of attention to fearful as compared to happy faces could be due to the novelty of fearful faces in infants' rearing environment. The infants looked longer at the fearful face than at the control stimulus, whereas no such difference was found between the other expressions and the control stimulus. Second, a gap/overlap paradigm was used to determine whether facial expressions affect the infants' ability to disengage their fixation from a centrally presented face and shift attention to a peripheral target. It was found that infants disengaged their fixation significantly less frequently from fearful faces than from control stimuli and happy faces. Novel facial expressions did not have a similar effect on attention disengagement. Thus, it seems that adult-like modulation of the disengagement of attention by threat-related stimuli can be observed early in life, and that the influence of emotionally salient (fearful) faces on visual attention is not simply attributable to the novelty of these expressions in infants' rearing environment.

8.
High levels of trait hostility are associated with wide-ranging interpersonal deficits and heightened physiological response to social stressors. These deficits may be attributable in part to individual differences in the perception of social cues. The present study evaluated the ability to recognize facial emotion among 48 high hostile (HH) and 48 low hostile (LH) smokers and whether experimentally-manipulated acute nicotine deprivation moderated relations between hostility and facial emotion recognition. A computer program presented series of pictures of faces that morphed from a neutral emotion into increasing intensities of happiness, sadness, fear, or anger, and participants were asked to identify the emotion displayed as quickly as possible. Results indicated that HH smokers, relative to LH smokers, required a significantly greater intensity of emotion expression to recognize happiness. No differences were found for other emotions across HH and LH individuals, nor did nicotine deprivation moderate relations between hostility and emotion recognition. This is the first study to show that HH individuals are slower to recognize happy facial expressions and that this occurs regardless of recent tobacco abstinence. Difficulty recognizing happiness in others may impact the degree to which HH individuals are able to identify social approach signals and to receive social reinforcement.

9.
Rule learning (RL) is an implicit learning mechanism that allows infants to detect and generalize rule-like repetition-based patterns (such as ABB and ABA) from a sequence of elements. Increasing evidence shows that RL operates both in the auditory and the visual domain and is modulated by perceptual expertise with the to-be-learned stimuli. Yet, whether infants’ ability to detect a high-order rule from a sequence of stimuli is affected by affective information remains a largely unexplored issue. Using a visual habituation paradigm, we investigated whether the presence of emotional expressions with a positive and a negative value (i.e., happiness and anger) modulates 7- to 8-month-old infants’ ability to learn a rule-like pattern from a sequence of faces of different identities. Results demonstrate that emotional facial expressions (either positive or negative) modulate infants’ visual RL mechanism, even though positive and negative facial expressions affect infants’ RL in different ways: while anger disrupts infants’ ability to learn the rule-like pattern from a face sequence, in the presence of a happy face infants show a familiarity preference, thus maintaining their learning ability. These findings show that emotional expressions exert an influence on infants’ RL abilities, contributing to the investigation of how emotion and cognition interact in face processing during infancy.

10.
Early in the first year of life infants exhibit equivalent performance distinguishing among people within their own race and within other races. However, with development and experience, their face recognition skills become tuned to the groups of people they interact with the most. This developmental tuning is hypothesized to be the origin of adult face processing biases, including the other-race bias. In adults the other-race bias has also been associated with impairments in facial emotion processing for other-race faces. The present investigation aimed to show perceptual narrowing for other-race faces during infancy and to determine whether the race of a face influences infants' ability to match emotional sounds with emotional facial expressions. Behavioral (visual-paired comparison; VPC) and electrophysiological (event-related potentials; ERPs) measures were recorded in 5-month-old and 9-month-old infants. Behaviorally, 5-month-olds distinguished faces within their own race and within another race, whereas 9-month-olds only distinguished faces within their own race. ERPs were recorded while an emotion sound (laughing or crying) was presented prior to viewing an image of a static African American or Caucasian face expressing either a happy or a sad emotion. Consistent with behavioral findings, ERPs revealed race-specific perceptual processing of faces and emotion/sound face congruency at 9 months but not 5 months of age. In addition, from 5 to 9 months, the neural networks activated for sound/face congruency were found to shift from an anterior ERP component (Nc) related to attention to posterior ERP components (N290, P400) related to perception.

11.
Recent research has indicated that infants are capable of responding to stimuli in a manner indicating that they categorize them. Infant perception of orientation was examined within a framework of categorization. In one experiment, it was shown that 4-month-old infants generalized habituation from one range of oblique grating stimuli to another, consistent with the interpretation that any two oblique stimuli were perceived as more similar than a vertical and an oblique. Four-month-old infants' generalization was not due to a simple inability to discriminate between obliques (Experiment 2), so the results of Experiment 1 reflect in large part true categorization behavior and not categorical perception. Results for 2- and 3-month-old infants suggest that "vertical" serves as a reference stimulus in infant orientation perception, such that gross distinctions between vertical and nonvertical precede the development of the "oblique" category. The category boundary between oblique and vertical did not successfully predict better between- than within-category discrimination in 4-month-old infants (Experiment 3) under the conditions of these experiments.

12.
This experiment investigated social referencing as a form of discriminative learning in which maternal facial expressions signaled the consequences of the infant's behavior in an ambiguous context. Eleven 4- and 5-month-old infants and their mothers participated in a discrimination-training procedure using an ABAB design. Different consequences followed infants' reaching toward an unfamiliar object depending on the particular maternal facial expression. During the training phases, a joyful facial expression signaled positive reinforcement for the infant reaching for an ambiguous object, whereas a fearful expression signaled aversive stimulation for the same response. Baseline and extinction conditions were implemented as controls. Mothers' expressions acquired control over infants' approach behavior for all participants. All participants ceased to show discriminated responding during the extinction phase. The results suggest that 4- and 5-month-old infants can learn social referencing via discrimination training.

14.
Multi-label tasks confound age differences in perceptual and cognitive processes. We examined age differences in emotion perception with a technique that did not require verbal labels. Participants matched the emotion expressed by a target to two comparison stimuli, one neutral and one emotional. Angry, disgusted, fearful, happy, and sad facial expressions of varying intensity were used. Although older adults took longer to respond than younger adults, younger adults outperformed older adults only for the lowest intensity disgust and fear expressions. Some participants also completed an identity matching task in which target stimuli were matched on personal identity instead of emotion. Although irrelevant to the judgment, expressed emotion still created interference: all participants were less accurate when the apparent difference in expressive intensity of the matched stimuli was large, suggesting that salient emotion cues increased the difficulty of identity matching. Age differences in emotion perception were limited to very low intensity expressions.

15.
Recognition of facial affect in Borderline Personality Disorder
Patients with Borderline Personality Disorder (BPD) have been described as emotionally hyperresponsive, especially to anger and fear in social contexts. The aim was to investigate whether BPD patients are more sensitive but less accurate in terms of basic emotion recognition, and show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions. Twenty-five women with BPD were compared with healthy controls on two different facial emotion recognition tasks. The first task allowed the assessment of the subjective detection threshold as well as the number of evaluation errors on six basic emotions. The second task assessed a response bias to blends of basic emotions. BPD patients showed no general deficit on the affect recognition task, but did show enhanced learning over the course of the experiment. For ambiguous emotional stimuli, we found a bias towards the perception of anger in the BPD patients but not towards fear. BPD patients are accurate in perceiving facial emotions, and are probably more sensitive to familiar facial expressions. They show a bias towards perceiving anger when socio-affective cues are ambiguous. Interpersonal training should focus on the differentiation of ambiguous emotion in order to reduce a biased appraisal of others.

16.
Older adults perceive less intense negative emotion in facial expressions compared to younger counterparts. Prior research has also demonstrated that mood alters facial emotion perception. Nevertheless, there is little evidence evaluating the interactive effects of age and mood on emotion perception. This study investigated the effects of sad mood on younger and older adults’ perception of emotional and neutral faces. Participants rated the intensity of stimuli while listening to sad music and in silence, and measures of mood were administered. Both younger and older participants rated sad faces as displaying stronger sadness when they experienced sad mood. While younger participants showed no influence of sad mood on happiness ratings of happy faces, older adults rated happy faces as conveying less happiness when they experienced sad mood. This study demonstrates how emotion perception can change when a controlled mood induction procedure is applied to alter mood in young and older participants.

17.
Recognising emotions from faces that are partly covered is more difficult than from fully visible faces. The focus of the present study is on the role of an Islamic versus non-Islamic context, i.e. Islamic versus non-Islamic headdress, in perceiving emotions. We report an experiment that investigates whether briefly presented (40 ms) facial expressions of anger, fear, happiness and sadness are perceived differently when covered by a niqāb or turban, compared to a cap and shawl. In addition, we examined whether oxytocin, a neuropeptide regulating affection, bonding and cooperation between ingroup members and fostering outgroup vigilance and derogation, would differentially impact on emotion recognition from wearers of Islamic versus non-Islamic headdresses. The results first of all show that the recognition of happiness was more accurate when the face was covered by a Western compared to Islamic headdress. Second, participants more often incorrectly assigned sadness to a face covered by an Islamic headdress compared to a cap and shawl. Third, when correctly recognising sadness, they did so faster when the face was covered by an Islamic compared to Western headdress. Fourth, oxytocin did not modulate any of these effects. Implications for theorising about the role of group membership on emotion perception are discussed.

18.
Abrupt discontinuities in recognizing categories of emotion are found for the labelling of consciously perceived facial expressions. This has been taken to imply that, at a conscious level, we perceive facial expressions categorically. We investigated whether the abrupt discontinuities found in categorization for conscious recognition would be replaced by a graded transition for subthreshold stimuli. Fifteen volunteers participated in two experiments, in which participants viewed faces morphed from 100% fear to 100% disgust along seven increments. In Experiment A, target faces were presented for 30 ms, in Experiment B for 170 ms. Participants made two-alternative forced-choice decisions between fear and disgust. Results for the 30 ms presentation time indicated a significant linear trend between degree of morphing and classification of the images. Results for the 170 ms presentation time followed the higher order function found in studies of categorical perception. These results provide preliminary evidence for separate processes underlying conscious and nonconscious perception of facial expressions of emotion.

19.
Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers’ discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum “felt the same” or “felt different.” In the identification task, images were presented individually and participants were asked to label the emotion displayed on the face (e.g., “Does she look happy or sad?”). Results suggest that 3.5-year-olds have the same category boundary as adults. They were more likely to report that the image pairs felt “different” at the image pair that crossed the category boundary. These results suggest that 3.5-year-olds perceive happy and sad emotional facial expressions categorically as adults do. Categorizing emotional expressions is advantageous for children if it allows them to use social information faster and more efficiently.

20.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed.

