Similar Literature
20 similar documents found (search time: 0 ms)
1.
When we observe someone shift their gaze to a peripheral event or object, a corresponding shift in our own attention often follows. This social orienting response, joint attention, has been studied in the laboratory using the gaze cueing paradigm. Here, we investigate the combined influence of the emotional content displayed in two critical components of a joint attention episode: the facial expression of the cue face, and the affective nature of the to-be-localized target object. Hence, we presented participants with happy and disgusted faces as cueing stimuli, and neutral (Experiment 1) or pleasant and unpleasant (Experiment 2) pictures as target stimuli. The findings demonstrate an effect of ‘emotional context’ confined to participants viewing pleasant pictures. Specifically, gaze cueing was boosted when the emotion of the gazing face (i.e., happy) matched that of the targets (pleasant). This modulation by emotional context highlights the vital flexibility that a successful joint attention system requires to assist our navigation of the social world.
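Gaze-cueing effects like the one described above are conventionally quantified as the mean reaction-time difference between invalidly and validly cued targets. A minimal sketch in Python, with made-up reaction times rather than data from the study:

```python
from statistics import mean

# Illustrative RTs (ms); these are hypothetical values, not the study's data.
valid_rts = [412, 398, 405, 420, 401]    # target appeared where the face gazed
invalid_rts = [445, 430, 452, 438, 447]  # target appeared opposite the gaze

# Positive values indicate that attention followed the gaze cue.
cueing_effect = mean(invalid_rts) - mean(valid_rts)
print(f"gaze-cueing effect: {cueing_effect:.1f} ms")
```

The "boost" reported in Experiment 2 would correspond to a larger difference score in the emotionally congruent (happy face / pleasant target) condition than in the incongruent one.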

2.
To evaluate whether there is an early attentional bias towards negative stimuli, we tracked participants' eyes while they passively viewed displays composed of four Ekman faces. In Experiment 1 each display consisted of three neutral faces and one face depicting fear or happiness. In half of the trials, all faces were inverted. Although the passive viewing task should have been very sensitive to attentional biases, we found no evidence that overt attention was biased towards fearful faces. Instead, people tended to actively avoid looking at the fearful face. This avoidance was evident very early in scene viewing, suggesting that the threat associated with the faces was evaluated rapidly. Experiment 2 replicated this effect and extended it to angry faces. In sum, our data suggest that negative facial expressions are rapidly analysed and influence visual scanning, but, rather than attract attention, such faces are actively avoided.

3.
Facial expressions are a basic form of non-verbal communication that convey important social information to others. The relevancy of this information is highlighted by findings that backward masked facial expressions facilitate spatial attention. This attention effect appears to be mediated through a neural network consisting of the amygdala, anterior cingulate, and visual cortex. However, a direct investigation of the neural time course associated with orienting to such stimuli has yet to be performed. In the current investigation, a backward masked fearful face dot-probe task was performed while ERPs were recorded. Reaction time results suggest that spatial attention is captured by backward masked fearful faces and attention is focused at the location of the fear stimulus. Masked right visual field fearful faces enhanced the N170 amplitudes of contralateral occipito-temporal electrodes. The rapid contralateral N170 enhancement was positively correlated with participants’ behavioral index of spatial attention. Thus, backward masked fearful face-elicited spatial attention facilitates behavior and modulates the early stage of facial processing reflected by the N170.

4.
Research has shown that anger faces represent a potent motivational incentive for individuals with a high implicit power motive (nPower). However, it is well known that anger expressions can vary in intensity, ranging from mild anger to rage. To examine nPower-relevant processing of emotional intensity in anger faces, an ERP oddball task with facial stimuli was utilized, with neutral expressions as the standard and targets varying in anger intensity (50%, 100%, or 150% emotive). Thirty-one college students participated in the experiment (15 low- and 16 high-nPower individuals, as determined by the Picture Story Exercise, PSE). Compared with low-nPower individuals, high-nPower individuals showed a higher percentage of correct responses when discriminating low-intensity (50%) anger faces from neutral faces. ERPs for 100% versus 150% anger expressions revealed that high-intensity (150%) expressions elicited larger P3a and late positive potential (LPP) amplitudes relative to prototypical (100%) anger expressions in power-motivated individuals. Conversely, low-nPower participants showed no differences at either the P3a or the LPP component. These findings demonstrate that persons with high nPower are sensitive to intensity changes in anger faces and that their sensitivity increases with the intensity of the anger expression.

5.
Some theories of emotion emphasise a close relationship between interoception and the subjective experience of emotion. In this study, we used facial expressions to examine whether interoceptive sensibility modulated emotional experience in a social context. Interoceptive sensibility was measured using the heartbeat detection task. To estimate individual emotional sensitivity, we created morphed photos ranging between a neutral and an emotional facial expression (i.e., anger, sadness, disgust and happiness). Recognition rates for particular emotions in these photos were calculated and treated as emotional sensitivity thresholds. Our results indicate that participants with accurate interoceptive awareness are sensitive to the emotions of others, especially expressions of sadness and happiness. We also found that false responses to sad faces were closely related to an individual's degree of social anxiety. These results suggest that interoceptive awareness modulates the intensity of the subjective experience of emotion and affects individual traits related to emotion processing.

6.
Happy, surprised, disgusted, angry, sad, fearful, and neutral facial expressions were presented extrafoveally (2.5° away from fixation) for 150 ms, followed by a probe word for recognition (Experiment 1) or a probe scene for affective valence evaluation (Experiment 2). Eye movements were recorded and gaze-contingent masking prevented foveal viewing of the faces. Results showed that (a) happy expressions were recognized faster than others in the absence of fixations on the faces, (b) the same pattern emerged when the faces were presented upright or upside-down, (c) happy prime faces facilitated the affective evaluation of emotionally congruent probe scenes, and (d) such priming effects occurred at 750 but not at 250 ms prime–probe stimulus–onset asynchrony. This reveals an advantage in the recognition of happy faces outside of overt visual attention, and suggests that this recognition advantage relies initially on featural processing and involves processing of positive affect at a later stage.

7.
We demonstrate non-conscious processing beyond valence by employing the masked emotional priming paradigm (Rohr, Degner, & Wentura, 2012) with a stimulus-onset asynchrony (SOA) variation. Emotional faces were briefly presented and immediately masked, followed by the target face, with an SOA of either 43 ms or 143 ms. Targets were categorized as happy, angry, fearful, or sad. With the short SOA, we replicated the differentiated priming effect within the negative domain (i.e., angry primes were differentiated from fearful/sad ones). A direct test of prime awareness indicated that primes could not be discriminated consciously in this condition. With the long SOA, however, we did not observe the priming effect, whereas the direct test indicated some degree of conscious processing. Thus, indirect effects dissociated from direct effects in our study, an indication of non-conscious processing. The present study thereby provides evidence for non-conscious processing of emotional information beyond a simple positive–negative differentiation.

8.
An important question in neuroscience is which multisensory information, presented outside of awareness, can influence the nature and speed of conscious access to our percepts. Recently, proprioceptive feedback of the hand was reported to lead to faster awareness of congruent hand images in a breaking continuous flash suppression (b-CFS) paradigm. Moreover, a vast literature suggests that spontaneous facial mimicry can improve emotion recognition, even without awareness of the stimulus face. However, integration of visual and proprioceptive information about the face to date has not been tested with CFS. The modulation of visual awareness of emotional faces by facial proprioception was investigated across three separate experiments. Face proprioception was induced with voluntary facial expressions or with spontaneous facial mimicry. Frequentist statistical analyses were complemented with Bayesian statistics. No evidence of multisensory integration was found, suggesting that proprioception does not modulate access to visual awareness of emotional faces in a CFS paradigm.

9.
Increasing evidence indicates that evaluation of affective stimuli facilitates the execution of affect-congruent approach and avoidance responses, and vice versa. These effects are proposed to be mediated by increases or decreases in the relative distance to the stimulus, due to the participant's action. In a series of experiments we investigated whether stimulus categorisation is similarly influenced when changes in this relative distance are due to movement of the stimulus instead of movements by the participant. Participants responded to happy and angry faces that appeared to approach (move towards) or withdraw (move away) from them. In line with previous findings, affective categorisation was facilitated when the movement was congruent with stimulus valence, resulting in faster and more correct responses to approaching happy and withdrawing angry faces. These findings suggest that relative distance indeed plays a crucial role in approach–avoidance congruency effects, and that these effects do not depend on the execution of movements by the participant.

10.
Facial stimulus processing is an important topic for explaining how people comprehend affective dispositions in others. The present research explored the attentive and pre-attentive elaboration of emotional facial expressions using a backward masking procedure. Specifically, the unconscious mental process of emotion comprehension was analyzed: pictures presenting happy, sad, angry, fearful, disgusted, and surprised expressions were shown to 21 subjects in both attentive and pre-attentive conditions, and event-related potentials (ERPs) were recorded in the two conditions. The two processes, attentive and pre-attentive, appear similar in nature, since they are marked by analogous ERP deflections. In fact, two ERP effects were found: a positive (P300) deflection, maximally distributed over the parietal regions, and a negative (N200) deflection, more localized over the frontal sites. Nevertheless, some differences between the two conditions emerged in the quantitative modulation of the two peaks. The N200 effect, larger in the attentive condition, may be considered an index of conscious processing of emotional faces, whereas the P3 (P3a) effect, larger in the pre-attentive condition, was considered a specific marker of the automatic, unconscious process underlying emotional face comprehension.

11.
Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, facial expression versus gender and age comparison permits us to contrast whether the emotional or non‐emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). Taken together, these results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD.

12.
In daily experience, children have access to a variety of cues to others’ emotions, including face, voice, and body posture. Determining which cues they use at which ages will help to reveal how the ability to recognize emotions develops. For happiness, sadness, anger, and fear, preschoolers (3-5 years, N = 144) were asked to label the emotion conveyed by dynamic cues in four cue conditions. The Face-only, Body Posture-only, and Multi-cue (face, body, and voice) conditions all were well recognized (M > 70%). In the Voice-only condition, recognition of sadness was high (72%), but recognition of the three other emotions was significantly lower (34%).

13.
Using the item-method directed forgetting paradigm (i.e. intentionally forgetting specified information), we examined directed forgetting of facial identity as a function of facial expression and the sex of the expresser and perceiver. Participants were presented with happy and angry male and female faces cued for either forgetting or remembering, and were then asked to recognise previously studied faces from among a series of neutral faces. For each recognised test face, participants also recalled the face’s previously displayed emotional expression. We found that angry faces were more resistant to forgetting than were happy faces. Furthermore, angry expressions on male faces and happy expressions on female faces were recognised and recalled better than vice versa. Signal detection analyses revealed that male faces gave rise to a greater sensitivity than female faces did, and male participants, but not female participants, showed greater sensitivity to male faces than to female faces. Several theoretical implications are discussed.
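The signal detection analyses mentioned above rest on sensitivity measures such as d′, computed from hit and false-alarm rates. A minimal sketch (the rates below are illustrative, not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Illustrative: greater sensitivity to male faces than to female faces.
print(round(d_prime(0.85, 0.20), 2))  # male faces
print(round(d_prime(0.75, 0.30), 2))  # female faces
```

A larger d′ means old faces are better discriminated from new ones; the study's pattern would show up as a higher d′ for male than for female faces.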

14.
Schizophrenia-spectrum disorders are characterized by deficits in social domains. Extant research has reported an impaired ability to perceive emotional faces in schizophrenia. Yet, it is unclear if these deficits occur already in the access to visual awareness. To investigate this question, 23 people with schizophrenia or schizoaffective disorder and 22 healthy controls performed a breaking continuous flash suppression task with fearful, happy, and neutral faces. Response times were analysed with generalized linear mixed models. People with schizophrenia-spectrum disorders were slower than controls in detecting faces, but did not show emotion-specific impairments. Moreover, happy faces were detected faster than neutral and fearful faces, across all participants. Although caution is needed when interpreting the main effect of group, our findings may suggest an elevated threshold for visual awareness in schizophrenia-spectrum disorders, but an intact implicit emotion perception. Our study provides a new insight into the mechanisms underlying emotion perception in schizophrenia-spectrum disorders.

15.
Several studies have investigated the role of featural and configural information in processing facial identity. Much less is known about their contribution to emotion recognition. In this study, we addressed this issue by inducing either a featural or a configural processing strategy (Experiment 1) and by investigating the attentional strategies deployed in response to emotional expressions (Experiment 2). In Experiment 1, participants identified emotional expressions in faces that were presented in three different versions (intact, blurred, and scrambled) and in two orientations (upright and inverted). Blurred faces contain mainly configural information, and scrambled faces contain mainly featural information; inversion is known to selectively hinder configural processing. Analyses of the discriminability measure (A′) and response times (RTs) revealed that configural processing plays a more prominent role in expression recognition than featural processing, but their relative contribution varies depending on the emotion. In Experiment 2, we qualified these differences between emotions by investigating the relative importance of specific features by means of eye movements. Participants had to match intact expressions with the emotional cues that preceded the stimulus. The analysis of eye movements confirmed that the recognition of different emotions relies on different types of information: while the mouth is important for detecting happiness and fear, the eyes are more relevant for anger, fear, and sadness.
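The discriminability measure A′ analysed in Experiment 1 is a nonparametric sensitivity index derived from hit and false-alarm rates (commonly attributed to Pollack and Norman); 0.5 corresponds to chance and 1.0 to perfect discrimination. A sketch with illustrative rates, not values from the study:

```python
def a_prime(h: float, f: float) -> float:
    """Nonparametric discriminability from hit rate h and false-alarm rate f."""
    if h >= f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))

# Illustrative rates for an intact upright face vs. a scrambled face.
print(round(a_prime(0.90, 0.15), 2))  # intact
print(round(a_prime(0.70, 0.30), 2))  # scrambled
```

Unlike d′, A′ makes no assumption of equal-variance Gaussian evidence distributions, which is why it is often preferred when hit or false-alarm rates approach 0 or 1.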

16.
Some evidence suggests that the cerebellum participates in the complex network processing emotional facial expression. To evaluate the role of the cerebellum in recognising facial expressions we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p=.0021; cathodal tDCS, p=.018), but left positive emotion and neutral facial expressions unchanged (p>.05). tDCS over the right prefrontal cortex left facial expressions of both negative and positive emotion unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.

17.
This review focuses on facial asymmetries during emotional expression. Facial asymmetry is defined as the expression intensity or muscular involvement on one side of the face (“hemiface”) relative to the other side and has been used as a behavioral index of hemispheric specialization for facial emotional expression. This paper presents a history of the neuropsychological study of facial asymmetry, originating with Darwin. Both quantitative and qualitative aspects of asymmetry are addressed. Next, neuroanatomical bases for facial expression are elucidated, separately for posed/voluntary and spontaneous/involuntary elicitation conditions. This is followed by a comprehensive review of 49 experiments of facial asymmetry in the adult literature, oriented around emotional valence (pleasantness/unpleasantness), elicitation condition, facial part, social display rules, and demographic factors. Results of this review indicate that the left hemiface is more involved than the right hemiface in the expression of facial emotion. From a neuropsychological perspective, these findings implicate the right cerebral hemisphere as dominant for the facial expression of emotion. In spite of the compelling evidence for right-hemispheric specialization, some data point to the possibility of differential hemispheric involvement as a function of emotional valence. An earlier version of this paper by the first author was presented at the XV Annual Symposium of the Society of Craniofacial Genetics, July 12, 1992, Stanford University, Palo Alto, CA.

18.
The intention to execute a movement can modulate our perception of sensory events, and this modulation is observed ahead of both ocular and upper limb movements. However, theoretical accounts of these effects, and also the empirical data, are often contradictory. Accounts of “active touch”, and the premotor theory of attention, have emphasized how movement intention leads to enhanced perceptual processing at the target of a movement, or on the to-be-moved effector. By contrast, recent theories of motor control emphasize how internal “forward” model (FM) estimates may be used to cancel or attenuate sensory signals that arise as a result of self-generated movements. We used behavioural and functional brain imaging (functional magnetic resonance imaging, fMRI) to investigate how perception of a somatosensory stimulus differed according to whether it was delivered to a hand that was about to execute a reaching movement or the alternative, nonmoving, hand. The results of our study demonstrate that a somatosensory stimulus delivered to a hand that is being prepared for movement is perceived to have occurred later than when that same stimulus is delivered to a nonmoving hand. This result indicates that it takes longer for a tactile stimulus to be detected when it is delivered to a moving limb and may correspond to a change in perceptual threshold. Our behavioural results are paralleled by the results of our fMRI study that demonstrated that there were significantly reduced blood-oxygen-level-dependent (BOLD) responses within the parietal operculum and insula following somatosensory stimulation of the hand being prepared for movement, compared to when an identical stimulus was delivered to a nonmoving hand. 
These findings are consistent with the prediction of FM accounts of motor control that postulate that central sensory suppression of somatosensation accompanies self-generated limb movements, and with previous reports indicating that effects of sensory suppression are observed in higher order somatosensory regions.

19.
The medial prefrontal cortex (mPFC) has been implicated in attending to one’s own emotional states, but the role of emotional valence in this context is not understood. We examined valence-specific BOLD activity in a previously validated functional magnetic resonance imaging (fMRI) paradigm. Ten healthy subjects viewed emotional pictures and categorized their experience as pleasant, unpleasant or neutral. All three categories activated a common region within mPFC. Subtraction of neutral from pleasant or unpleasant conditions instead revealed ventromedial PFC (vmPFC), suggesting that this region represents emotional valence. During exteroceptive attention, greater mPFC responses were observed in response to emotional relative to neutral stimuli, consistent with studies implicating mPFC in the top-down modulation of emotion-biased attention. These findings may help to integrate the two proposed roles of mPFC in emotional representation and top-down modulation of subcortical structures.

20.
In this review, we survey the state of the field of functional magnetic resonance imaging (fMRI) as it relates to drug discovery and drug development. We highlight the advantages and limitations of fMRI for this purpose and suggest ways to improve the use of fMRI for developing new therapeutics, with emphasis on treatments for anxiety disorders. Fundamentally, pharmacological studies with standard psychiatric treatments using standardized behavioral probes during fMRI will need to be carried out to determine characteristic brain signatures that could be used to predict whether novel compounds are likely to have specific therapeutic effects.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号