The Integration of Dynamic Facial and Vocal Emotion and Its Neurophysiological Mechanism
Cite this article: WANG Ping, PAN Zhihui, ZHANG Lijie, CHEN Xuhai. The Integration of Dynamic Facial and Vocal Emotion and Its Neurophysiological Mechanism[J]. Advances in Psychological Science, 2015, 23(7): 1109-1117. DOI: 10.3724/SP.J.1042.2015.01109
Authors: WANG Ping, PAN Zhihui, ZHANG Lijie, CHEN Xuhai
Affiliation: Shaanxi Province Key Laboratory of Behavior and Cognitive Neuroscience, School of Psychology, Shaanxi Normal University, Xi'an 710062, China
Funding: Supported by the National Natural Science Foundation of China (31300835), the Humanities and Social Sciences Foundation of the Ministry of Education (12XJC190003), and the Fundamental Research Funds for the Central Universities (14SZYB07).
Abstract: The integration of facial and vocal emotional information is an essential skill for social interaction, and in recent years it has drawn growing attention from psychology and neuroscience. Existing research has systematically examined the behavioral performance of, and the factors influencing, bimodal emotion integration, and has largely answered the two questions of central concern to cognitive neuroscience: "when" and "where" integration occurs. However, two key questions still lack systematic investigation: can facial and vocal emotional information be integrated into a coherent emotional object, and how is bimodal emotional information unified in the brain? Accordingly, this project systematically manipulates the emotional salience of facial and vocal stimuli and the task demands, introduces dynamic face-voice stimuli to increase external validity, and combines behavioral and electrophysiological techniques, mining the data from multiple angles, in particular through neural-oscillation (time-frequency and coherence) analyses. It thereby examines whether dynamic facial and vocal emotional information can be integrated into a coherent emotional object, and seeks to clarify the mechanism of bimodal emotion integration at the level of neural oscillations.

Keywords: facial emotion, vocal emotion, integration, neural oscillation, event-related potentials
Received: 2014-05-19

The Integration of Dynamic Facial and Vocal Emotion and Its Neurophysiological Mechanism
WANG Ping, PAN Zhihui, ZHANG Lijie, CHEN Xuhai. The Integration of Dynamic Facial and Vocal Emotion and Its Neurophysiological Mechanism[J]. Advances in Psychological Science, 2015, 23(7): 1109-1117. DOI: 10.3724/SP.J.1042.2015.01109
Authors:WANG Ping  PAN Zhihui  ZHANG Lijie  CHEN Xuhai
Affiliation:(Shaanxi Province Key Laboratory of Behavior and Cognitive Neuroscience, School of Psychology, Shaanxi Normal University, Xi’an 710062, China)
Abstract: The integration of facial-vocal emotion is an important factor in successful communication that has intrigued psychologists and neuroscientists in recent years. Previous studies have elaborated on the behavioral performance of, and the influencing factors for, facial-vocal emotion integration, as well as "when" and "where" information from the two modalities is integrated. However, it remains an open question whether the integration of facial-vocal emotion follows the principles of multisensory integration (e.g., the principle of inverse effectiveness), and how the bimodal emotional information merges into a coherent emotional object. Therefore, taking "whether facial-vocal emotion integration obeys the principle of inverse effectiveness" as the main line, we designed six experiments that systematically manipulated the emotional salience of dynamic facial-vocal emotional stimuli and the task demands. Moreover, using multi-dimensional analysis of behavioral and EEG data, especially time-frequency and coherence analysis of the EEG data, we aimed to answer the two proposed questions and to further reveal the neurophysiological mechanism of facial-vocal emotion integration.
Keywords:facial emotion  vocal emotion  integration  neural oscillation  ERPs  
This article is indexed in CNKI and other databases.