Intermodal event files: integrating features across vision, audition, taction, and action
Authors: Sharon Zmigrod, Michiel Spapé, Bernhard Hommel
Institution: (1) Department of Psychology, Cognitive Psychology Unit, Leiden University Institute for Psychological Research and Leiden Institute for Brain and Cognition, Postbus 9555, 2300 RB Leiden, The Netherlands
Abstract: Understanding how the human brain integrates features of perceived events calls for the examination of binding processes within and across different modalities and domains. Recent studies of feature-repetition effects have demonstrated interactions between shape, color, and location in the visual modality, and between pitch, loudness, and location in the auditory modality: repeating one feature is beneficial if other features are also repeated, but detrimental if they are not. These partial-repetition costs suggest that co-occurring features are spontaneously bound into temporary event files. Here, we investigated whether these observations extend to features from different sensory modalities, combining visual and auditory features in Experiment 1 and auditory and tactile features in Experiment 2. The same types of interactions as for unimodal feature combinations were obtained, including interactions between stimulus and response features. However, the size of the interactions varied with the particular combination of features, suggesting that the salience of features and the temporal overlap between feature-code activations play a mediating role.
This article is indexed in PubMed, SpringerLink, and other databases.