Similar Articles
20 similar articles retrieved (search time: 0 ms).
1.
2.
Social phobics do not misinterpret facial expression of emotion (total citations: 1; self-citations: 0; citations by others: 1)
Attentional biases in the processing of threatening facial expressions in social anxiety are well documented. It is generally assumed that these attentional biases originate in an evaluative bias: socially threatening information would be evaluated more negatively by socially anxious individuals. However, three studies have failed to find evidence of a negative evaluative bias in the processing of emotional facial expressions (EFE) in socially anxious individuals. These studies, however, suffer from several methodological limitations that the present study attempted to overcome. Twenty-one out-patients diagnosed with generalized social phobia were compared with 20 out-patients diagnosed with another anxiety disorder and with 39 normal controls matched for gender, age, and level of education. Participants decoded a set of 40 EFE, whose intensity and emotional nature were manipulated, on seven emotion-intensity scales. Although sufficient statistical power was ensured, no differences among groups were found in decoding accuracy, attributed emotion intensity, or reported difficulty of the task. Based on these findings, as well as on other evidence, we propose that evaluative biases in social anxiety, if they exist, are implicit and automatic, and that they may be determined by the relevance of the stimulus to the person's concerns rather than by the stimulus valence. The implications of these findings for the interpersonal processes involved in social phobia are discussed.

3.
Male college students (N = 96) were met by an experimental confederate who either agreed or disagreed with their opinion. The subjects were then given an opportunity to deliver electric shock to the confederate (victim), who responded to the shock with a facial expression of anger, fear, joy, or neutrality. The opinion condition had no effect, but the victim's facial expressions were clearly perceived by the subjects, and two of the expressions significantly influenced the amount of shock delivered to the victim. The expression of enjoyment (smile) increased aggression, whereas that of anger decreased it. The effects of the fear and neutral expressions did not differ from each other, and neither had a consistent, significant effect on the amount of shock the subjects administered.

4.
Adult attachment orientation has been associated with specific patterns of emotion regulation. The present research examined the effects of attachment orientation on the perceptual processing of emotional stimuli. Experimental participants played computerized movies of faces that expressed happiness, sadness, and anger. Over the course of the movies, the facial expressions became neutral. Participants reported the frame at which the initial expression no longer appeared on the face. Under conditions of no distress (Study 1), fearfully attached individuals saw the offset of both happiness and anger earlier, and preoccupied and dismissive individuals later, than the securely attached individuals. Under conditions of distress (Study 2), insecurely attached individuals perceived the offset of negative facial expressions as occurring later than did the secure individuals, and fearfully attached individuals saw the offset later than either of the other insecure groups. The mechanisms underlying the effects are considered.

5.
Visual-field bias in the judgment of facial expression of emotion (total citations: 2; self-citations: 0; citations by others: 2)
The left and right hemispheres of the brain are differentially related to the processing of emotions. Although there is little doubt that the right hemisphere is relatively superior for processing negative emotions, controversy exists over the hemispheric role in the processing of positive emotions. Eighty right-handed normal male participants were examined for visual-field (left-right) differences in the perception of facial expressions of emotion. Facial composite (RR, LL) and hemifacial (R, L) sets depicting expressions of happiness and sadness were prepared. Pairs of such photographs were presented bilaterally for 150 ms, and participants were asked to select the photograph that looked more expressive. A left visual-field superiority (a right-hemisphere function) was found for sad facial emotion. No hemispheric advantage was found for the perception of happy expressions.

6.
This study investigated whether the underlying structure of responses to facial expressions of emotion would emerge when the exposure time was increased. Twenty-five participants judged facial photographs presented for varying durations of exposure, ranging from 4 msec. to 64 msec. in 4-msec. steps. A dual scaling method was used to analyze possible response differentiation as a function of exposure time. Two major components were extracted; based on the configuration of variables, they were interpreted as valence (hedonic tone) and activation. Results indicated that a positive emotion and highly activated emotions such as surprise and fear were easily recognized even under relatively brief exposure to the stimuli.

7.
Facial expression is heralded as a communication system common to all human populations, and thus is generally accepted as a biologically based, universal behavior. Happiness, sadness, fear, anger, surprise, and disgust are universally recognized and produced emotions, and communication of these states is deemed essential for navigating the social environment. It is puzzling, however, how individuals are capable of producing similar facial expressions when facial musculature is known to vary greatly among individuals. Here, the authors show that although some facial muscles are not present in all individuals, and often exhibit great asymmetry (larger or absent on one side), the facial muscles essential for producing the universal facial expressions exhibited 100% occurrence and minimal gross asymmetry in 18 cadavers. This explains how universal facial expression production is achieved, implies that facial muscles have been selected for an essential nonverbal communicative function, and shows that they nonetheless accommodate individual variation.

8.
Inversion interferes with the encoding of configural and holistic information more than it does with the encoding of explicitly represented and isolated parts. Accordingly, if facial expressions are explicitly represented in the face representation, their recognition should not be greatly affected by face orientation. In the present experiment, response times to detect a difference in hair color in line-drawn faces were unaffected by face orientation, but response times to detect the presence of the brows and mouth were longer with inverted than with upright faces, independent of the emergent expression (neutral, happy, sad, or angry). These results suggest that expressions are not explicitly represented; rather, they and the face configuration are represented as undecomposed wholes.

9.
The authors evaluated an intervention program developed to remediate children's deficits in reading emotions in facial expressions. Thirty children from two elementary schools in suburban Atlanta participated in six 30-min sessions over 4 weeks in which they were taught to discriminate, identify, express, and apply facial expression cues. The ability to read emotion in facial expressions improved significantly for the intervention group compared with the control group. For girls, improvement in identifying facial expressions was associated with lower social anxiety and higher self-worth; boys' self-concept was negatively related to improvement. On the basis of the results, the authors suggested that structured interventions like the present one could be used to improve students' nonverbal processing abilities within public school settings, with some cautions regarding the impact of new learning for boys.

10.
11.
Twenty-two children (ages 4 to 6 yr.) from a day-care service were asked to "make a face" that would show how they would feel in five situations representing the basic emotions of happiness, sadness, disgust, anger, and fear. Children decoded their own videotaped responses one week later, and they also decoded the same expressions presented by a child whom they did not know. Groups of day-care teachers and university students also decoded the children's facial responses. Recognizability of all emotions by all decoders improved linearly with the age of the child (a 9% gain per year). Happy and disgusted expressions were the most easily decoded.
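
For illustration only, the reported linear improvement with age can be expressed as a simple trend fit. The following Python sketch uses hypothetical age and accuracy values (not the study's data) to show how a per-year gain in recognizability, such as the 9% figure above, would be estimated.

import numpy as np

# Hypothetical decoding accuracy by child age (proportion of expressions decoded correctly);
# these values are placeholders, not data from the study.
ages = np.array([4.0, 4.5, 5.0, 5.5, 6.0])
accuracy = np.array([0.42, 0.47, 0.51, 0.56, 0.60])

# A first-degree polynomial fit: the slope estimates the gain in recognizability per year.
slope, intercept = np.polyfit(ages, accuracy, deg=1)
print(f"Estimated gain in recognizability per year: {slope:.1%}")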

12.
Provides a comprehensive review of John T. Lanzetta's research program on facial expression and emotion. After reviewing the study that initiated this research program (Lanzetta & Kleck, 1970), the program is described as developing along four distinct lines of research: (1) the role of facial expression in the modulation and self-regulation of emotion, (2) the evocative power of the face as an emotional stimulus, (3) the role of facial expression in empathy and counterempathy, and (4) the role of facial displays in human politics. Beyond reviewing the major studies and key findings to emerge from each of these lines, the progression of thought underlying the development of this research program as a whole and the interrelations among the individual research lines are also emphasized.

13.
How similar are the meanings of facial expressions of emotion and the emotion terms frequently used to label them? In three studies, subjects made similarity judgments and emotion self-report ratings in response to six emotion categories represented in Ekman and Friesen's Pictures of Facial Affect, and their associated labels. Results were analyzed with respect to the constituent facial movements using the Facial Action Coding System, and with consensus analysis, multidimensional scaling, and inferential statistics. Shared interpretation of meaning was found between individuals and the group, with congruence among the meaning of the facial expressions, labeling with basic emotion terms, and subjects' reported emotional responses. The data suggest that (1) the general labels used by Ekman and Friesen are appropriate but may not be optimal, (2) certain facial movements contribute more to the perception of emotion than others do, and (3) perception of emotion may be categorical rather than dimensional.
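
As an illustration of the multidimensional scaling step mentioned above, the sketch below embeds hypothetical averaged similarity judgments for the six emotion categories in two dimensions using scikit-learn. The similarity values are invented for the example; this is not the authors' data or analysis pipeline.

import numpy as np
from sklearn.manifold import MDS

emotions = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

# Hypothetical mean pairwise similarity ratings (1 = identical, 0 = maximally dissimilar).
similarity = np.array([
    [1.00, 0.10, 0.15, 0.05, 0.40, 0.05],
    [0.10, 1.00, 0.45, 0.35, 0.20, 0.30],
    [0.15, 0.45, 1.00, 0.40, 0.50, 0.35],
    [0.05, 0.35, 0.40, 1.00, 0.25, 0.55],
    [0.40, 0.20, 0.50, 0.25, 1.00, 0.20],
    [0.05, 0.30, 0.35, 0.55, 0.20, 1.00],
])

# MDS operates on dissimilarities, so convert before embedding in two dimensions.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(1.0 - similarity)

for label, (x, y) in zip(emotions, coords):
    print(f"{label:>9}: ({x:+.2f}, {y:+.2f})")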

14.
Summary. Images of pleasant scenes usually produce increased activity over the zygomaticus major muscle, as measured by electromyography (EMG), while less activity is elicited by unpleasant images. However, increases in zygomaticus major EMG activity while viewing unpleasant images have occasionally been reported in the literature on affective facial expression (i.e., "grimacing"). To examine the possibility that individual differences in emotion regulation might be responsible for this inconsistently observed phenomenon, the habitual emotion regulation tendencies of 63 participants (32 women) were assessed, and participants were categorized according to those tendencies. Participants viewed emotionally salient images while zygomaticus major EMG activity was recorded, and they provided self-report ratings of their experienced emotional valence and arousal while viewing the pictures. Although their affective ratings were intact, participants who were less likely to use the cognitive reappraisal strategy to regulate their emotions showed the "grimacing" pattern of zygomaticus major activity.

15.
Anxiety and fear are often confounded in discussions of human emotions. However, studies of rodent defensive reactions under naturalistic conditions suggest that anxiety is functionally distinct from fear. Unambiguous threats, such as predators, elicit flight from rodents (if an escape route is available), whereas ambiguous threats (e.g., the odor of a predator) elicit risk assessment behavior, which is associated with anxiety because it is preferentially modulated by anti-anxiety drugs. However, without human evidence, it would be premature to assume that rodent-based psychological models are valid for humans. We tested the human validity of the risk assessment explanation for anxiety by presenting 8 volunteers with emotive scenarios and asking them to pose facial expressions. Photographs and videos of these expressions were shown to 40 participants, who matched them to the scenarios and labeled each expression. Scenarios describing ambiguous threats were preferentially matched to the facial expression posed in response to the same scenario type. This expression consisted of two plausible environmental-scanning behaviors (eye darts and head swivels) and was labeled as anxiety, not fear. The facial expression elicited by unambiguous threat scenarios was labeled as fear. The emotion labels generated were then presented to another 18 participants, who matched them back to photographs of the facial expressions. This back-matching of labels to faces also linked anxiety to the environmental-scanning face rather than to the fear face. The results therefore suggest that anxiety produces a distinct facial expression and that it has adaptive value in situations that are ambiguously threatening, supporting a functional, risk-assessment explanation for human anxiety.

16.
Although facial expressions of emotion have signal value, there is surprisingly little research examining how that signal can be detected under various conditions, because most judgment studies utilize full-face, frontal views. We remedy this by obtaining judgments of frontal and profile views of the same expressions displayed by the same expressors. We predicted that recognition accuracy when viewing faces in profile would be lower than when judging the same faces from the front. Contrary to this prediction, there were no differences in recognition accuracy as a function of view, suggesting that emotions are judged equally well regardless of the angle from which they are viewed.

17.
Most neuropsychological studies have reported a left-right hemifacial asymmetry during the expression of emotion. The present author, however, proposed a concept of hemiregional asymmetry and reported that greater left-hemiface involvement was specific to the lower region of the face, whereas greater right-hemiface involvement was specific to the upper region. It is speculated that this differential hemifacial involvement is a function of the right hemisphere and can be explained in terms of the neural innervation of the facial muscles. This speculation requires further verification before a general conclusion can be drawn.

18.
19.
Facial electromyographic (EMG) activity at the corrugator and zygomatic muscle regions was recorded from 37 subjects while they posed happy and sad facial expressions. While a happy expression was posed, mean EMG activity was highest at the left zygomatic muscle region, followed by the right zygomatic, left corrugator, and right corrugator regions. While a sad expression was posed, mean EMG activity was highest at the left corrugator muscle region, followed by the right corrugator, left zygomatic, and right zygomatic regions. Further analysis indicated that facial EMG activity was stronger on the left side of the face than on the right side while both happy and sad expressions were posed.
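
A minimal sketch of the kind of left-versus-right comparison reported above, using simulated EMG amplitudes rather than the study's recordings; the sample size matches the abstract, but the values, units, and effect size are assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 37

# Simulated zygomatic EMG amplitudes (arbitrary units) while posing a happy expression.
# The left side is generated to be slightly stronger, mirroring the reported asymmetry.
left_zygomatic = rng.normal(loc=12.0, scale=3.0, size=n_subjects)
right_zygomatic = left_zygomatic - rng.normal(loc=1.5, scale=1.0, size=n_subjects)

# Paired comparison of left vs. right amplitudes across subjects.
t_stat, p_value = stats.ttest_rel(left_zygomatic, right_zygomatic)
print(f"t({n_subjects - 1}) = {t_stat:.2f}, p = {p_value:.4g}")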

20.
Two experiments were conducted to explore whether representational momentum (RM) emerges in the perception of dynamic facial expression and whether the velocity of change affects the size of the effect. Participants observed short morphing animations of facial expressions changing from neutral to one of the six basic emotions. Immediately afterward, they were asked to select the last image they had perceived. The results revealed that an RM effect emerged for dynamic facial expressions of emotion: the last image of a dynamic stimulus that observers perceived showed a facial configuration of stronger emotional intensity than the image actually presented. The greater the velocity of change, the more the perceptual image of the facial expression intensified. This perceptual enhancement suggests that dynamic information facilitates shape processing in facial expression, which leads to efficient detection of other people's emotional changes from their faces.

