
Attentional Characteristics of High-Empathy People in Processing Emotional Stimuli and Evidence from Their Eye Movements
Cite this article: 孙俊才, 刘萍, 李丹. Attentional Characteristics of High-Empathy People in Processing Emotional Stimuli and Evidence from Their Eye Movements [J]. 心理科学, 2018, 0(5): 1084-1089.
Authors: 孙俊才  刘萍  李丹
Affiliations: 1. Qufu Normal University; 2. Shanghai Normal University
Abstract: Empathy refers to the emotional experience in which an individual, by observing, imagining, or inferring another person's emotions, comes to share an affective state isomorphic with them. This study used the dot-probe paradigm combined with eye tracking, with emotional words and facial expressions as materials, to examine the attentional bias of high- and low-empathy participants toward different types of stimuli and the time course of its specific components. The results showed that although both high- and low-empathy participants displayed early attentional orienting toward negative stimuli, especially sad faces (shorter time to first fixation), only high-empathy participants showed prolonged late attentional maintenance on sad faces (longer total fixation duration). The study indicates that facial expressions are a more effective material than words for distinguishing empathy-related attentional bias, and that high-empathy participants show an attentional bias toward sad facial expressions, providing important empirical evidence for understanding interpersonal mind perception.

Keywords: empathy  emotion processing  attentional characteristics  dot-probe  eye movement
Received: 2017-12-09
Revised: 2018-07-01

Attentional Characteristics of High-Empathy People in Processing Emotional Stimuli and Evidence from Their Eye Movements
Abstract: Empathy refers to the ability of an individual to understand and feel someone else's affect, and it is vital for individual development and interpersonal harmony. Mind perception and mind reading are the two main approaches to the study of intersubjectivity in psychology, and empathy is an important embodiment of mind perception. To understand and resonate with others' emotions, empathic individuals mainly rely on two inputs: a situational understanding system and an emotional-cue classification system. One can perceive other people's inner feelings through their emotional cues (such as facial expressions), and one can also make inferences from situational information (such as emotional words) when such intuitive emotional cues are lacking. Although studies of the effect of empathy on emotional information processing generally use facial expressions as stimuli, several studies have shown that faces and words are processed through partly different neural mechanisms. To our knowledge, however, it remains unclear whether the attentional bias of high-empathy people toward words and toward facial expressions is consistent. Are high-empathy individuals more inclined to understand the emotions of others, rather than merely classifying and representing emotional cues?

Methods: This study combined the dot-probe paradigm with eye-movement tracking to investigate the specific components of attentional bias toward words and faces. Fifty-five subjects were screened into high- and low-empathy groups (29 in the high-empathy group) based on their total scores on the Interpersonal Reactivity Index (IRI) questionnaire. The experiment used a 2 (subject type: high-empathy, low-empathy) × 2 (material: words, faces) × 3 (pairing condition: negative-congruent, negative-incongruent, neutral-neutral) mixed design. Word and face stimuli were selected from the Chinese Affective Words System and the Chinese Face Affective Picture System, respectively. Following previous studies, initial attentional orienting was measured as the time to first fixation, and attention maintenance was measured by total fixation duration. Subjects' reaction times for judging the position of the probe were also recorded.

Results: In the attention-orienting phase, both groups showed faster early orienting toward emotional stimuli than toward neutral stimuli (shorter time to first fixation), particularly toward sad faces, and high-empathy subjects directed their attention to faces more quickly than to words. In the attention-maintenance phase, total fixation time on sad faces was longer than on negative words; an independent-samples t test showed that high-empathy subjects maintained late-stage attention on sad faces longer (longer total fixation duration) than low-empathy subjects. Reaction times did not differ across pairing conditions.

Conclusion: This study demonstrates that facial expressions are a more effective index than words for distinguishing empathy-related attentional bias, and that high-empathy subjects show a processing advantage in the orienting-maintenance pattern of attention toward sad faces, which provides an important empirical basis for interpersonal mind perception.
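A minimal sketch of the group comparison described in the Results (not the authors' analysis code): an independent-samples t test on total fixation duration for sad faces, high- versus low-empathy group. The file name and column names are hypothetical placeholders for per-subject eye-movement measures.

import pandas as pd
from scipy import stats

# Hypothetical per-subject data: one row per subject with a group label and
# the mean total fixation duration (ms) on sad faces.
data = pd.read_csv("fixation_durations.csv")

high = data.loc[data["group"] == "high_empathy", "total_fixation_ms"]
low = data.loc[data["group"] == "low_empathy", "total_fixation_ms"]

# Independent-samples t test between the two groups.
t, p = stats.ttest_ind(high, low)
print(f"t = {t:.2f}, p = {p:.3f}")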
Keywords: empathy  emotion processing  attentional characteristics  dot-probe  eye movement
Indexed by: CNKI and other databases.