Similar literature (20 records found)
1.
The aim was to explore whether people high as opposed to low in speech anxiety react with a more pronounced differential facial response when exposed to angry and happy facial stimuli. High and low fear participants were selected based on their scores on a fear of public speaking questionnaire. All participants were exposed to pictures of angry and happy faces while facial electromyographic (EMG) activity from the Corrugator supercilii and the Zygomaticus major muscle regions was recorded. Skin conductance responses (SCR), heart rate (HR) and ratings were also collected. Participants high as opposed to low in speech anxiety displayed a larger differential corrugator responding, indicating a larger negative emotional reaction, between angry and happy faces. They also reacted with a larger differential zygomatic responding, indicating a larger positive emotional reaction, between happy and angry faces. Consistent with the facial reaction patterns, the high fear group rated angry faces as more unpleasant and as expressing more disgust, and further rated happy faces as more pleasant. There were no differences in SCR or HR responding between high and low speech anxiety groups. The present results support the hypothesis that people high in speech anxiety are disposed to show an exaggerated sensitivity and facial responsiveness to social stimuli.

2.
Although facial information is distributed over spatial as well as temporal domains, thus far research on selective attention to disapproving faces has concentrated predominantly on the spatial domain. This study examined the temporal characteristics of visual attention towards facial expressions by presenting a Rapid Serial Visual Presentation (RSVP) paradigm to high (n=33) and low (n=34) socially anxious women. Neutral letter stimuli (p, q, d, b) were presented as the first target (T1), and emotional faces (neutral, happy, angry) as the second target (T2). Irrespective of social anxiety, the attentional blink was attenuated for emotional faces. Emotional faces as T2 did not influence identification accuracy of a preceding (neutral) target. The relatively low threshold for the (explicit) identification of emotional expressions is consistent with the view that emotional facial expressions are processed relatively efficiently.
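As a concrete illustration of how an attentional blink effect like the one above is typically quantified, the sketch below computes T2 accuracy conditional on a correct T1 report (T2|T1) at a short versus a long T1-T2 lag, separately for neutral and emotional face targets. The trial table, column names and lag values are invented for illustration and are not taken from the study.

```python
import pandas as pd

# Invented RSVP trials: each row pairs a T1 letter report with a T2 face
# presented at a short (lag 2) or long (lag 8) T1-T2 separation.
trials = pd.DataFrame({
    "t2_emotion": ["neutral", "neutral", "emotional", "emotional"] * 3,
    "lag":        [2, 8] * 6,
    "t1_correct": [True] * 11 + [False],
    "t2_correct": [False, True, True, True, False, True,
                   True, True, True, True, False, True],
})

# T2|T1 accuracy: T2 report accuracy on trials where T1 was correct.
t2_given_t1 = (trials[trials["t1_correct"]]
               .groupby(["t2_emotion", "lag"])["t2_correct"].mean()
               .unstack("lag"))

# Attentional blink magnitude = long-lag accuracy minus short-lag accuracy;
# an attenuated blink for emotional faces means a smaller value for that row.
blink = (t2_given_t1[8] - t2_given_t1[2]).rename("blink_magnitude")
print(t2_given_t1)
print(blink)
```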

3.
Event-related brain potentials were measured in 7- and 12-month-old infants to examine the development of processing happy and angry facial expressions. In 7-month-olds a larger negativity to happy faces was observed at frontal, central, temporal and parietal sites (Experiment 1), whereas 12-month-olds showed a larger negativity to angry faces at occipital sites (Experiment 2). These data suggest that processing of these facial expressions undergoes development between 7 and 12 months: while 7-month-olds exhibit heightened sensitivity to happy faces, 12-month-olds resemble adults in their heightened sensitivity to angry faces. In Experiment 3 infants' visual preference was assessed behaviorally, revealing that the differences in ERPs observed at 7 and 12 months do not simply reflect differences in visual preference.

4.
Recent studies have shown that reaction times to expressions of anger with averted gaze and fear with direct gaze appear slower than those to direct anger and averted fear. Such findings have been explained by appealing to the notion of gaze/expression congruence with aversion (avoidance) associated with fear, whereas directness (approach) is associated with anger. The current study examined reactions to briefly presented direct and averted faces displaying expressions of fear and anger. Participants were shown four blocked series of faces; each block contained an equal mix of two facial expressions (neutral plus either fear or anger) presented at one viewpoint (either full face or three quarter leftward facing). Participants were instructed to make rapid responses classifying the expressions as either neutral or expressive. Initial analysis of reaction time distributions showed differences in distribution shape with reactions to averted anger and direct fear showing greater skew than those to direct anger and averted fear. Computational modelling, using a diffusion model of decision making and reaction time, showed a difference in the rate of information accrual with more rapid rates of accrual when viewpoint and expression were congruent. This analysis supports the notion of signal congruence as a mechanism through which gaze and viewpoint affect our responses to facial expressions.
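The modelling step mentioned above can be illustrated with a minimal drift-diffusion sketch: evidence accumulates noisily toward one of two boundaries, and a higher accrual (drift) rate, as reported for gaze/expression-congruent faces, yields faster and less right-skewed reaction-time distributions. The simulation below is a toy illustration only; the parameter values and the congruent/incongruent contrast are assumptions, not estimates from the study.

```python
import numpy as np

def simulate_ddm_rts(drift, n_trials=2000, threshold=1.0, noise=1.0,
                     dt=0.001, non_decision=0.3, seed=None):
    """Simulate decision times from a simple two-boundary diffusion model.

    Evidence starts at 0 and accumulates with the given drift until it
    reaches +threshold or -threshold; the crossing time plus a fixed
    non-decision time is returned as the reaction time (in seconds).
    """
    rng = np.random.default_rng(seed)
    rts = np.empty(n_trials)
    for i in range(n_trials):
        evidence, t = 0.0, 0.0
        while abs(evidence) < threshold:
            evidence += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts[i] = t + non_decision
    return rts

# Hypothetical drift rates: congruent (e.g., direct anger) vs. incongruent
# (e.g., averted anger) gaze/expression pairings.
congruent = simulate_ddm_rts(drift=2.0, seed=1)
incongruent = simulate_ddm_rts(drift=1.2, seed=2)

for label, rts in [("congruent", congruent), ("incongruent", incongruent)]:
    skew = ((rts - rts.mean()) ** 3).mean() / rts.std() ** 3
    print(f"{label}: mean RT = {rts.mean():.3f} s, skew = {skew:.2f}")
```

The lower drift rate in the incongruent condition produces both slower mean reaction times and a more skewed distribution, which is the qualitative pattern the authors explain via the rate of information accrual.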

5.
Using 20 levels of intensity, we measured children’s thresholds to discriminate the six basic emotional expressions from neutral and their misidentification rates. Combined with the results of a previous study using the same method (Journal of Experimental Child Psychology, 102 (2009) 503-521), the results indicate that by 5 years of age, children are adult-like, or nearly adult-like, for happy expressions on all measures. Children’s sensitivity to other expressions continues to improve between 5 and 10 years of age (e.g., surprise, disgust, fear) or even after 10 years of age (e.g., anger, sadness). The results indicate that there is a slow development of sensitivity to the expression of all basic emotions except happy. This slow development may impact children’s social and cognitive development by limiting their sensitivity to subtle expressions of disapproval or disappointment.
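A threshold in this kind of graded-intensity procedure is typically estimated by fitting a psychometric function to the proportion of correct discriminations at each intensity level and reading off a criterion point (for example, 75% correct for a two-alternative judgement). The sketch below uses 20 intensity levels to match the paradigm, but the response data, the logistic form and the 50% guess rate are assumptions for illustration, not the study's method or results.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, alpha, beta, gamma=0.5):
    """Psychometric function: guess rate gamma rising toward 1, midpoint alpha."""
    return gamma + (1 - gamma) / (1 + np.exp(-beta * (x - alpha)))

# 20 expression-intensity levels (0 = neutral, 1 = full expression) and
# invented proportions correct, as might come from one child for one emotion.
intensity = np.linspace(0.05, 1.0, 20)
rng = np.random.default_rng(0)
p_correct = np.clip(logistic(intensity, alpha=0.45, beta=10)
                    + rng.normal(0, 0.03, intensity.size), 0, 1)

(alpha_hat, beta_hat), _ = curve_fit(lambda x, a, b: logistic(x, a, b),
                                     intensity, p_correct, p0=[0.5, 5.0])

# With a 50% guess rate, the midpoint alpha is exactly the 75%-correct point,
# so it serves directly as the discrimination threshold.
print(f"estimated discrimination threshold: {alpha_hat:.2f} of full intensity")
```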

6.
孙俊才, 石荣. 《心理学报》 (Acta Psychologica Sinica), 2017, (2): 155-163
Using a two-choice oddball paradigm and a cue-target paradigm combined with eye tracking, and taking smiling, crying and neutral facial expressions as stimuli, this study examined attentional biases toward crying faces during recognition and disengagement. In the recognition phase, crying faces were identified more accurately and more quickly than smiling faces; a further analysis of fixation bias within areas of interest showed that the fixation patterns for crying and smiling faces were broadly similar yet differed in subtle ways. In the disengagement phase, inhibition of return was modulated by the expression of the cue: under valid-cue conditions, mean fixation durations on the target and saccade latencies were significantly shorter after crying-face cues than after other expression cues. These results indicate that crying faces elicit different attentional biases during recognition and disengagement: in recognition, an advantage in response output together with both commonalities and subtle differences in fixation patterns; in disengagement, facilitated localization and visual processing of the target under valid-cue conditions.

7.
Prior reports of preferential detection of emotional expressions in visual search have yielded inconsistent results, even for face stimuli that avoid obvious expression-related perceptual confounds. The current study investigated inconsistent reports of anger and happiness superiority effects using face stimuli drawn from the same database. Experiment 1 excluded procedural differences as a potential factor, replicating a happiness superiority effect in a procedure that previously yielded an anger superiority effect. Experiments 2a and 2b confirmed that image colour or poser gender did not account for prior inconsistent findings. Experiments 3a and 3b identified stimulus set as the critical variable, revealing happiness or anger superiority effects for two partially overlapping sets of face stimuli. The current results highlight the critical role of stimulus selection for the observation of happiness or anger superiority effects in visual search, even for face stimuli that avoid obvious expression-related perceptual confounds and are drawn from a single database.

8.
Increasing evidence indicates that eye gaze direction affects the processing of emotional faces in anxious individuals. However, the effects of eye gaze direction on the behavioral responses elicited by emotional faces, such as avoidance behavior, remain largely unexplored. We administered an Approach-Avoidance Task (AAT) in high (HSA) and low socially anxious (LSA) individuals. All participants responded to photographs of angry, happy and neutral faces (presented with direct and averted gaze), by either pushing a joystick away from them (avoidance) or pulling it towards them (approach). Compared to LSA, HSA were faster in avoiding than approaching angry faces. Most crucially, this avoidance tendency was only present when the perceived anger was directed towards the subject (direct gaze) and not when the gaze of the face-stimulus was averted. In contrast, HSA individuals tended to avoid happy faces irrespective of gaze direction. Neutral faces elicited no approach-avoidance tendencies. Thus, avoidance of angry faces in social anxiety as measured by AA-tasks reflects avoidance of subject-directed anger and not of negative stimuli in general. In addition, although both anger and joy are considered to reflect approach-related emotions, gaze direction did not affect HSA's avoidance of happy faces, suggesting differential mechanisms affecting responses to happy and angry faces in social anxiety.
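Approach-avoidance tendencies in an AAT are commonly summarised as a bias score per condition, for example the median push (avoid) reaction time minus the median pull (approach) reaction time, with negative values indicating relatively faster avoidance. The sketch below assumes a trial table with group, emotion, gaze, movement and RT columns; the column names and numbers are illustrative placeholders, not the study's data.

```python
import pandas as pd

# Illustrative trial-level data: angry faces only, with direct or averted
# gaze, each answered by a push (avoid) or pull (approach) joystick movement.
trials = pd.DataFrame({
    "group":    ["HSA"] * 4 + ["LSA"] * 4,
    "emotion":  ["angry"] * 8,
    "gaze":     ["direct", "direct", "averted", "averted"] * 2,
    "movement": ["push", "pull"] * 4,
    "rt_ms":    [620, 700, 660, 655, 640, 645, 650, 648],
})

# Bias score = median push (avoid) RT - median pull (approach) RT;
# negative values mean avoidance was relatively faster than approach.
medians = (trials.groupby(["group", "gaze", "movement"])["rt_ms"]
                 .median().unstack("movement"))
bias = (medians["push"] - medians["pull"]).rename("avoidance_bias_ms")
print(bias)
```

With these made-up numbers the strongly negative score appears only for the high socially anxious group with direct-gaze angry faces, mirroring the pattern the abstract describes.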

9.
The important ability to discriminate facial expressions of emotion develops early in human ontogeny. In the present study, 7-month-old infants’ event-related potentials (ERPs) in response to angry and fearful emotional expressions were measured. The angry face evoked a larger negative component (Nc) at fronto-central leads between 300 and 600 ms after stimulus onset when compared to the amplitude of the Nc to the fearful face. Furthermore, over posterior channels, the angry expression elicited a N290 that was larger in amplitude and a P400 that was smaller in amplitude than for the fearful expression. This is the first study to show that the ability of infants to discriminate angry and fearful facial expressions can be measured at the electrophysiological level. These data suggest that 7-month-olds allocated more attentional resources to the angry face, as indexed by the Nc. One implication of this result may be that the social signal values of the two expressions were perceived differentially, not merely as “negative”. Furthermore, it is possible that the angry expression might have been more arousing and discomforting for the infant compared with the fearful expression.
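The Nc comparison above rests on mean amplitude within a latency window (300-600 ms) averaged over fronto-central electrodes. The sketch below shows that computation on a plain NumPy epochs array; the channel names, trial counts and simulated voltages are placeholders, not the infant data.

```python
import numpy as np

def mean_window_amplitude(epochs, times, channels, picks, t_min, t_max):
    """Mean amplitude over selected channels and a latency window.

    epochs: array of shape (n_trials, n_channels, n_samples), in microvolts
    times:  1-D array of sample times in seconds (0 = stimulus onset)
    Returns one value per trial.
    """
    chan_idx = [channels.index(ch) for ch in picks]
    window = (times >= t_min) & (times <= t_max)
    return epochs[:, chan_idx][:, :, window].mean(axis=(1, 2))

# Placeholder epochs: 30 trials x 4 channels, sampled at 1000 Hz from
# -100 ms to +599 ms around stimulus onset (simulated voltages).
rng = np.random.default_rng(0)
times = np.arange(-100, 600) / 1000.0
channels = ["Fz", "Cz", "F3", "F4"]
angry = rng.normal(-8.0, 3.0, (30, len(channels), times.size))
fearful = rng.normal(-5.0, 3.0, (30, len(channels), times.size))

nc_angry = mean_window_amplitude(angry, times, channels, ["Fz", "Cz"], 0.3, 0.6)
nc_fearful = mean_window_amplitude(fearful, times, channels, ["Fz", "Cz"], 0.3, 0.6)
print(f"mean Nc amplitude, angry: {nc_angry.mean():.1f} µV, "
      f"fearful: {nc_fearful.mean():.1f} µV")
```

A more negative mean value for the angry condition would correspond to the larger Nc reported in the abstract; the per-trial values returned by the helper are what would feed a statistical comparison.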

10.
In this study, individual proneness to anger, either as an experienced emotion or as action readiness, was examined in university students of both sexes in Japan and Spain by administering the Anger Situation Questionnaire (ASQ) to 425 subjects (195 in Japan and 230 in Spain). Feelings of anger experience were higher than readiness for action in all samples. This difference between the two variables was larger in females than in males, and larger in the Spanish sample than in the Japanese one. An intragroup analysis by sex in each country showed that the relative differences between males and females were similar in both countries. These consistent differences between the sexes appear to be independent of the cultural context. Aggr. Behav. 28:429–438, 2002.

11.
Numerous studies have shown an exacerbation of attentional bias towards threat in anxiety states. However, the cognitive mechanisms responsible for these attentional biases remain largely unknown. Further, the authors outline the need to consider the nature of the attentional processes in operation (hypervigilance, avoidance, or disengagement). We adapted a dot-probe paradigm to record behavioral and electrophysiological responses in 26 participants reporting high or low fear of evaluation, a major component of social anxiety. Pairs of faces including a neutral and an emotional face (displaying anger, fear, disgust, or happiness) were presented for 200 ms and then replaced by a neutral target to discriminate. Results show that anxious participants were characterized by an increased P1 in response to pairs of faces, irrespective of the emotional expression included in the pair. They also showed an increased P2 in response to angry–neutral pairs selectively. Finally, in anxious participants, the P1 response to targets was enhanced when replacing emotional faces, whereas non-anxious subjects showed no difference between the two conditions. These results indicate an early hypervigilance to face stimuli in social anxiety, coupled with difficulty in disengaging from threat and sustained attention to emotional stimuli. They are discussed within the framework of current models of anxiety and psychopathology.

12.
13.
We investigated whether attractiveness ratings of expressive faces would be affected by gaze shifts towards or away from the observer. In all experiments, effects of facial expression were found, with higher attractiveness ratings for positive than for negative expressions, irrespective of effects of gaze shifts. In the first experiment, faces with gaze shifts away from the observer were preferred. However, when the dynamics of the gaze shift were disrupted by adding an intermediate delay, the effect of direction of gaze shift disappeared. By manipulating the relative duration of each gaze direction during a gaze shift, we found higher attractiveness ratings for faces with a longer duration of direct gaze, particularly in the initial exposure to a face. Our findings suggest that although the temporal dynamics of eye gaze and facial expressions influence the aesthetic evaluation of faces, these cues appear to act independently rather than in an integrated manner for social perception.

14.
A small body of research suggests that socially anxious individuals show biases in interpreting the facial expressions of others. The current study included a clinically anxious sample in a speeded emotional card-sorting task in two conditions (baseline and threat) to investigate several hypothesized biases in interpretation. Following the threat manipulation, participants with generalized social anxiety disorder (GSADs) sorted angry cards with greater accuracy, but also evidenced a greater rate of neutral cards misclassified as angry, as compared to nonanxious controls. The controls showed the opposite pattern, sorting neutral cards with greater accuracy but also misclassifying a greater proportion of angry cards as neutral, as compared to GSADs. These effects were accounted for primarily by low-intensity angry cards. Results are consistent with previous studies showing a negative interpretive bias, and can be applied to the improvement of clinical interventions.

15.
Purpose: Event-related brain potentials (ERPs) were used to investigate the neural correlates of emotion processing in 5- to 8-year-old children who do and do not stutter. Methods: Participants were presented with an audio contextual cue followed by images of threatening (angry/fearful) and neutral facial expressions from similarly aged peers. Three conditions differed in audio-image pairing: neutral context-neutral expression (neutral condition), negative context-threatening expression (threat condition), and reappraisal context-threatening expression (reappraisal condition). These conditions reflected social stimuli that are ecologically valid to the everyday life of children. Results: P100, N170, and late positive potential (LPP) ERP components were elicited over parietal and occipital electrodes. The threat condition elicited an increased LPP mean amplitude compared to the neutral condition across our participants, suggesting increased emotional reactivity to threatening facial expressions. In addition, LPP amplitude decreased during the reappraisal condition, evidence of emotion regulation. No group differences were observed in the mean amplitude of ERP components between children who do and do not stutter. Furthermore, dimensions of childhood temperament and stuttering severity were not strongly correlated with LPP elicitation. Conclusion: These findings suggest that, at this young age, children who stutter exhibit typical brain activation underlying emotional reactivity and regulation to social threat from peer facial expressions.

16.
The Approach–Avoidance Task (AAT) was employed to indirectly investigate avoidance reactions to stimuli of potential social threat. Forty-three highly socially anxious individuals (HSAs) and 43 non-anxious controls (NACs) reacted to pictures of emotional facial expressions (angry, neutral, or smiling) or to control pictures (puzzles) by pulling a joystick towards themselves (approach) versus pushing it away from themselves (avoidance). HSAs showed stronger avoidance tendencies than NACs for smiling as well as angry faces, whereas no group differences were found for neutral faces and puzzles. In contrast, valence ratings of the emotional facial expressions did not differ between groups. A critical discrepancy between direct and indirect measures was observed for smiling faces: HSAs evaluated them positively, but reacted to them with avoidance.

17.
Women’s cradling side preference has been related to contralateral hemispheric specialization for processing emotional signals, but not for processing babies’ facial expressions. Therefore, 46 nulliparous female volunteers were characterized as left or non-left holders (HG) during a doll holding task. During a signal detection task, they were then asked to detect the emotional baby faces in a series of baby portraits with neutral and emotional facial expressions, presented either to the left or the right visual field (VFP). ANOVA revealed a significant HG × VFP interaction on response bias data (p < .05). Response bias was lowest when emotional baby faces were presented in the visual field of cradling side preference, suggesting that women’s cradling side preference may have evolved to save cognitive resources while monitoring emotional baby face signals.
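Response bias in a signal detection analysis such as the one above is usually expressed as the criterion c (with sensitivity d′ computed alongside it) from hit and false-alarm rates; a lower c corresponds to a more liberal tendency to report an emotional face. A minimal sketch with invented counts, assuming a yes/no detection structure; the numbers are not the study's data.

```python
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Return sensitivity d' and response bias (criterion c) from raw counts.

    A log-linear correction (add 0.5 to each cell) avoids hit or false-alarm
    rates of exactly 0 or 1, which would make the z-transform undefined.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_hit - z_fa, -0.5 * (z_hit + z_fa)

# Invented counts (hits, misses, false alarms, correct rejections) for
# emotional-face detection in the preferred (cradling-side) vs. the
# non-preferred visual field of one hypothetical participant.
counts = {"preferred field": (22, 2, 6, 18),
          "non-preferred field": (18, 6, 3, 21)}
for label, cells in counts.items():
    d_prime, criterion = sdt_measures(*cells)
    print(f"{label}: d' = {d_prime:.2f}, c = {criterion:.2f}")
```

With these made-up counts the criterion is lower in the preferred field, which is the direction of the effect the abstract reports.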

18.
Recently, investigators have challenged long‐standing assumptions that facial expressions of emotion follow specific emotion‐eliciting events and relate to other emotion‐specific responses. We address these challenges by comparing spontaneous facial expressions of anger, sadness, laughter, and smiling with concurrent, “on‐line” appraisal themes from narrative data, and by examining whether coherence between facial and appraisal components was associated with increased experience of emotion. Consistent with claims that emotion systems are loosely coupled, facial expressions of anger and sadness co‐occurred to a moderate degree with the expected appraisal themes, and when this happened, the experience of emotion was stronger. The results for the positive emotions were more complex, but lend credence to the hypothesis that laughter and smiling are distinct. Smiling co‐occurred with appraisals of pride, but never occurred with appraisals of anger. In contrast, laughter occurred more often with appraisals of anger, a finding consistent with recent evidence linking laughter to the dissociation or undoing of negative emotion.

19.
The present study aimed to investigate changes in facial expression recognition across the lifespan, as well as to determine the influence of fluid intelligence, processing speed, and memory on this ability. Peak performance in the ability to identify facial affect was found to occur in middle age, with the children and older adults performing the poorest. Specifically, older adults were impaired in their ability to identify fear, sadness, and happiness, but had preserved recognition of anger, disgust, and surprise. Analyses investigating the influence of cognition on emotion recognition demonstrated that cognitive abilities contribute to performance, especially for participants over age 45. However, the cognitive functions did not fully account for the older adults' impairments in expression recognition. Overall, the age-related deficits in facial expression recognition have implications for older adults' use of non-verbal communicative information.

20.
In the face literature, it is debated whether the identification of facial expressions requires holistic (i.e., whole face) or analytic (i.e., parts-based) information. In this study, happy and angry composite expressions were created in which the top and bottom face halves formed either an incongruent (e.g., angry top + happy bottom) or congruent composite expression (e.g., happy top + happy bottom). Participants reported the expression in the target top or bottom half of the face. In Experiment 1, the target half in the incongruent condition was identified less accurately and more slowly relative to the baseline isolated expression or neutral face conditions. In contrast, no differences were found between the congruent and baseline conditions. In Experiment 2, the effects of exposure duration were tested by presenting faces for 20, 60, 100 and 120 ms. Interference effects for the incongruent faces appeared at the earliest 20 ms interval and persisted for the 60, 100 and 120 ms intervals. In contrast, no differences were found between the congruent and baseline face conditions at any exposure interval. In Experiment 3, it was found that spatial alignment impaired the recognition of incongruent expressions, but had no effect on congruent expressions. These results are discussed in terms of holistic and analytic processing of facial expressions.
