Similar documents (20 results)
1.
Observing touch on another person's body activates brain regions involved in tactile perception, even when the observer's body is not directly stimulated. Previous work has shown that in some synaesthetes, this effect induces a sensation of being touched. The present study shows that if perceptual thresholds are experimentally manipulated, viewing touch can modulate tactile experience in nonsynaesthetes as well. When observers saw a face being touched by hands, rather than a face being merely approached by hands, they demonstrated enhanced detection of subthreshold tactile stimuli on their own faces. This effect was specific to observing touch on a body part, and was not found for touch on a nonbodily stimulus, namely, a picture of a house. In addition, the effect was stronger when subjects viewed their own faces rather than another person's face. Thus, observing touch can activate the tactile system, and if perceptual thresholds are manipulated, such activation can result in a behavioral effect in nonsynaesthetes. The effect is maximal when the observed body matches the observer's body.

2.
Two experiments investigated 18-month-olds' understanding of the link between visual perception and emotion. Infants watched an adult perform actions on objects. An emoter then expressed neutral affect or anger toward the adult in response to the adult's actions. Subsequently, infants were given 20 s to interact with each object. In Experiment 1, the emoter faced infants with a neutral expression during each 20-s response period but looked at either a magazine or the infant. In Experiment 2, the emoter faced infants with a neutral expression, and her eyes were either open or closed. When the emoter visually monitored infants' actions, the infants regulated their object-directed behavior on the basis of their memory of the emoter's affect. However, if the previously angry emoter read a magazine (Experiment 1) or closed her eyes (Experiment 2), infants were not governed by her prior emotion. Infants behaved as if they expected the emoter to get angry only if she could see them performing the actions. These findings suggest that infants appreciate how people's visual experiences influence their emotions and use this information to regulate their own behavior.

3.
Gaze perception is an important social skill, as it conveys information about what another person is attending to. Gaze direction has been shown to affect interpretation of emotional expression. Here the authors investigate whether emotional facial expression has a reciprocal influence on interpretation of gaze direction. In a forced-choice yes-no task, participants judged whether faces expressing different emotions (anger, fear, happiness, or neutral), shown at different viewing angles, were looking at them or not. Happy faces were more likely to be judged as looking at the observer than were angry, fearful, or neutral faces. Angry faces were more often judged as looking at the observer than were fearful and neutral expressions. These findings are discussed against the background of the approach and avoidance orientations of emotions and of the self-referential positivity bias.

4.
We investigated whether emotional information from facial expression and hand movement quality was integrated when identifying the expression of a compound stimulus showing a static facial expression combined with emotionally expressive dynamic manual actions. The emotions (happiness, neutrality, and anger) expressed by the face and hands were either congruent or incongruent. In Experiment 1, the participants judged whether the stimulus person was happy, neutral, or angry. Judgments were mainly based on the facial expressions, but were affected by manual expressions to some extent. In Experiment 2, the participants were instructed to base their judgment on the facial expression only. An effect of hand movement expressive quality was observed for happy facial expressions. The results conform with the proposal that perception of facial expressions of emotions can be affected by the expressive qualities of hand movements.

5.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as "threat-related," because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions.

6.
To examine the effect of crowding priming on the recognition of threatening facial expressions, 28 undergraduates completed anger-neutral and fear-neutral expression recognition tasks under crowded and non-crowded priming conditions. Signal detection analysis showed that crowding priming reduced discrimination sensitivity for angry expressions without affecting the response criterion, and affected neither sensitivity nor criterion for fearful expressions. Subjectively reported intensity of angry expressions was significantly higher under crowding priming than in the non-crowded condition, whereas rated intensity of fearful and neutral expressions was unaffected by the priming. These results indicate that crowding priming lowers perceptual sensitivity for discriminating angry expressions.
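The signal-detection indices reported above, discrimination sensitivity and response criterion, are conventionally computed as d′ = z(H) − z(F) and c = −(z(H) + z(F))/2 from the hit rate H and false-alarm rate F. A minimal sketch; the rates below are illustrative, not the study's data:

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Return (d', c): discrimination sensitivity and response criterion."""
    z = NormalDist().inv_cdf            # inverse of the standard normal CDF
    z_h, z_f = z(hit_rate), z(fa_rate)
    d_prime = z_h - z_f                 # perceptual discriminability
    criterion = -(z_h + z_f) / 2        # response bias (0 = unbiased)
    return d_prime, criterion

# hypothetical anger-vs-neutral discrimination: 75% hits, 30% false alarms
d_prime, criterion = sdt_measures(0.75, 0.30)
```

A drop in d′ with an unchanged c, as reported for angry expressions under crowding priming, reflects reduced perceptual sensitivity rather than a shift in response bias.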

7.
To evaluate whether there is an early attentional bias towards negative stimuli, we tracked participants' eyes while they passively viewed displays composed of four Ekman faces. In Experiment 1 each display consisted of three neutral faces and one face depicting fear or happiness. In half of the trials, all faces were inverted. Although the passive viewing task should have been very sensitive to attentional biases, we found no evidence that overt attention was biased towards fearful faces. Instead, people tended to actively avoid looking at the fearful face. This avoidance was evident very early in scene viewing, suggesting that the threat associated with the faces was evaluated rapidly. Experiment 2 replicated this effect and extended it to angry faces. In sum, our data suggest that negative facial expressions are rapidly analysed and influence visual scanning, but, rather than attract attention, such faces are actively avoided.

8.
This paper reports three studies in which stronger orienting to perceived eye gaze direction was revealed when observers viewed faces showing fearful or angry, compared with happy or neutral, emotional expressions. Gaze-related spatial cueing effects to laterally presented fearful faces and centrally presented angry faces were also modulated by the anxiety level of participants, with high- but not low-state anxious individuals revealing enhanced shifts of attention. In contrast, both high- and low-state anxious individuals demonstrated enhanced orienting to averted gaze when viewing laterally presented angry faces. These results provide novel evidence for the rapid integration of facial expression and gaze direction information, and for the regulation of gaze-cued attention by both the emotion conveyed in the perceived face and the degree of anxiety experienced by the observer.

9.
Classification of faces as to their sex or their expression—with sex and expression varying orthogonally—was studied in three experiments. In Experiment 1, expression classification was influenced by sex, with angry male faces being classified faster than angry female faces. Complementarily, sex classification was faster for happy than for angry female faces. In Experiment 2, mutual interaction of sex and expression was also found when the participants were asked to classify top and bottom face segments. In Experiment 3, a face inversion effect was found for both sex and expression classification of whole faces. However, a symmetrical interaction between sex and expression was again found. The results are discussed in terms of configural versus feature processing in the perception of face sex and expression and of their relevance to face perception models that postulate independent processing of different facial features.

10.
Previous studies of tactile spatial perception focussed either on a single point of stimulation, on local patterns within a single skin region such as the fingertip, on tactile motion, or on active touch. It remains unclear whether we should speak of a tactile field, analogous to the visual field, and supporting spatial relations between stimulus locations. Here we investigate this question by studying perception of large-scale tactile spatial patterns on the hand, arm and back. Experiment 1 investigated the relation between perception of tactile patterns and the identification of subsets of those patterns. The results suggest that perception of tactile spatial patterns is based on representing the spatial relations between locations of individual stimuli. Experiment 2 investigated the spatial and temporal organising principles underlying these relations. Experiment 3 showed that tactile pattern perception makes reference to structural representations of the body, such as body parts separated by joints. Experiment 4 found that precision of pattern perception is poorer for tactile patterns that extend across the midline, compared to unilateral patterns. Overall, the results suggest that the human sense of touch involves a tactile field, analogous to the visual field. The tactile field supports computation of spatial relations between individual stimulus locations, and thus underlies tactile pattern perception.

11.
The purpose of the present investigation was to assess whether interpersonal closeness facilitates earlier emotion detection as the emotional expression unfolds. Female undergraduate participants were either paired with a close friend or an acquaintance (n = 92 pairs). Participants viewed morphed movies of their partner and a stranger gradually shifting from a neutral to either a sad, angry, or happy expression. As predicted, findings indicate a closeness advantage. Close friends detected the onset of their partners’ angry and sad expressions earlier than acquaintances. Additionally, close friends were more accurate than acquaintances in identifying angry and sad expressions at the onset, particularly in non-vignette conditions when these expressions were void of context. These findings suggest that closeness does indeed facilitate emotional perception, particularly in ambiguous situations for negative emotions.

12.
This study addressed the relative reliance on face and body configurations for different types of emotion-related judgements: emotional state and motion intention. Participants viewed images of people with either emotionally congruent (both angry or fearful) or incongruent (angry/fearful; fearful/angry) faces and bodies. Congruent conditions provided baseline responses. Incongruent conditions revealed relative reliance on face and body information for different judgements. Body configurations influenced motion-intention judgements more than facial configurations: incongruent pairs with angry bodies were more frequently perceived as moving forward than those with fearful bodies; pairs with fearful bodies were more frequently perceived as moving away. In contrast, faces influenced emotional-state judgements more, but bodies moderated ratings of face emotion. Thus, both face and body configurations influence emotion perception, but the type of evaluation required influences their relative contributions. These findings highlight the importance of considering both the face and body as important sources of emotion information.

14.
In the present study we examined the neural correlates of facial emotion processing in the first year of life using ERP measures and cortical source analysis. EEG data were collected cross‐sectionally from 5‐ (N = 49), 7‐ (N = 50), and 12‐month‐old (N = 51) infants while they were viewing images of angry, fearful, and happy faces. The N290 component was larger in amplitude in response to fearful and happy than angry faces in all posterior clusters, and showed the largest response to fear relative to the other two emotions only over the right occipital area. The P400 and Nc components were larger in amplitude in response to angry than happy and fearful faces over central and frontal scalp. Cortical source analysis of the N290 component revealed greater cortical activation in the right fusiform face area in response to fearful faces. This effect started to emerge at 5 months and became well established at 7 months, but it disappeared at 12 months. The P400 and Nc components were primarily localized to the PCC/Precuneus, where heightened responses to angry faces were observed. The current results suggest that detection of a fearful face in the infant brain can occur shortly (~200–290 ms) after stimulus onset, and that this process may rely on the face network and develop substantially between 5 and 7 months of age. The current findings also suggest that the differential processing of angry faces occurs later, in the P400/Nc time window, recruits the PCC/Precuneus, and is associated with the allocation of infants’ attention.

15.
Touch communicates distinct emotions
The study of emotional signaling has focused almost exclusively on the face and voice. In 2 studies, the authors investigated whether people can identify emotions from the experience of being touched by a stranger on the arm (without seeing the touch). In the 3rd study, they investigated whether observers can identify emotions from watching someone being touched on the arm. Two kinds of evidence suggest that humans can communicate numerous emotions with touch. First, participants in the United States (Study 1) and Spain (Study 2) could decode anger, fear, disgust, love, gratitude, and sympathy via touch at much-better-than-chance levels. Second, fine-grained coding documented specific touch behaviors associated with different emotions. In Study 3, the authors provide evidence that participants can accurately decode distinct emotions by merely watching others communicate via touch. The findings are discussed in terms of their contributions to affective science and the evolution of altruism and cooperation.

16.
Recent research suggests that eye-gaze direction modulates perceived emotional expression. Here we explore the extent to which emotion affects interpretation of attention direction. We captured three-dimensional face models of 8 actors expressing happy, fearful, angry and neutral emotions. From these 3D models 9 views were extracted (0°, 2°, 4°, 6°, 8° to the left and right). These stimuli were randomly presented for 150 ms. Using a forced-choice paradigm 28 participants judged for each face whether or not it was attending to them. Two conditions were tested: either the whole face was visible, or the eyes were covered. In both conditions happy faces elicited most "attending-to-me" answers. Thus, emotional expression has a more general effect than an influence on gaze direction: emotion affects interpretation of attention direction. We interpret these results as a self-referential positivity bias, suggesting a general preference to associate a happy face with the self.

17.
Emotion researchers often categorize angry and fearful face stimuli as "negative" or "threatening". Perception of fear and anger, however, appears to be mediated by dissociable neural circuitries, and the two expressions often elicit distinguishable behavioral responses. The authors sought to elucidate whether viewing anger and fear expressions produces dissociable psychophysiological responses (i.e., the startle reflex). The results of two experiments using different facial stimulus sets (representing anger, fear, neutral, and happy) indicated that viewing anger was associated with a significantly heightened startle response (p < .05) relative to viewing fear, happy, and neutral. This finding suggests that while anger and fear faces both convey messages of "threat", their priming effect on startle circuitry differs. Thus, angry expressions, representing viewer-directed threat with an unambiguous source (i.e., the expresser), may more effectively induce a motivational propensity to withdraw or escape. The source of threat is comparatively less clear for fearful faces. The differential effects of these two facial threat signals on the defensive motivational system add to a growing literature highlighting the importance of distinguishing between emotional stimuli of similar valence, along lines of meaning and functional impact.

18.
Using eye-tracking, this study examined the own-age bias in face processing among children with autism spectrum disorders (ASD). In Experiment 1, 19 children with ASD and 23 chronological-age-matched typically developing (TD) children freely viewed own-age and other-age neutral faces, to test whether children with ASD show an own-age bias in face processing. In Experiment 2, 22 children with ASD and 25 age-matched TD children freely viewed happy, angry, and fearful faces, to examine how emotion affects this own-age bias. Results showed that (1) children with ASD fixated significantly longer on own-age than other-age neutral faces, and (2) under angry and fearful expressions, children with ASD fixated significantly longer on own-age than other-age faces, whereas no significant difference emerged for happy faces. These findings indicate that children with ASD show an own-age bias in face fixation, and that this bias is modulated by emotion.

19.
In 6 experiments, the authors investigated whether attention orienting by gaze direction is modulated by the emotional expression (neutral, happy, angry, or fearful) on the face. The results showed a clear spatial cuing effect by gaze direction but no effect by facial expression. In addition, it was shown that the cuing effect was stronger with schematic faces than with real faces, that gaze cuing could be achieved at very short stimulus onset asynchronies (14 ms), and that there was no evidence for a difference in the strength of cuing triggered by static gaze cues and by cues involving apparent motion of the pupils. In sum, the results suggest that in normal, healthy adults, eye direction processing for attention shifts is independent of facial expression analysis.

20.
Attending versus ignoring a stimulus can later determine how it will be affectively evaluated. Here, we asked whether attentional states could also modulate subsequent sensitivity to facial expressions of emotion. In a dual-task procedure, participants first rapidly searched for a gender-defined face among two briefly displayed neutral faces. Then a test face with the previously attended or ignored face’s identity was presented, and participants judged whether it was emotionally expressive (happy, angry, or fearful) or neutral. Intensity of expression in the test face was varied so that an expression detection threshold could be determined. When fearful or angry expressions were judged, expression sensitivity was worse for faces bearing the same identity as a previously ignored versus attended face. When happy expressions were judged, sensitivity was unaffected by prior attention. These data support the notion that the motivational value of stimuli may be reduced by processes associated with selective ignoring.
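A detection threshold of the kind described above is commonly taken as the stimulus intensity at which the proportion of "expressive" responses crosses a target level (e.g. 50%). A minimal linear-interpolation sketch; the intensity levels and response proportions below are hypothetical, not the study's data:

```python
def detection_threshold(intensities, p_detect, target=0.5):
    """Interpolate the intensity at which detection probability first
    crosses `target` (assumes p_detect is non-decreasing)."""
    pairs = list(zip(intensities, p_detect))
    for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
        if p0 <= target <= p1:
            # linear interpolation between the two bracketing points
            return x0 + (target - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("target proportion not bracketed by the data")

# hypothetical proportions of "expressive" judgements per morph intensity
levels = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
props = [0.05, 0.15, 0.40, 0.70, 0.90, 0.98]
threshold = detection_threshold(levels, props)
```

In this framing, a higher threshold for faces bearing a previously ignored identity would correspond to the reduced expression sensitivity reported above.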


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号