See an object, hear an object file: Object correspondence transcends sensory modality
Authors: Kerry E. Jordan, Kait Clark, Stephen R. Mitroff
Affiliation: Department of Psychology, Utah State University, Logan, UT, USA (kerry.jordan@usu.edu); Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience and Department of Psychology & Neuroscience, Duke University, Durham, NC, USA
Abstract: An important task of perceptual processing is to parse incoming information into distinct units and to keep track of those units over time as the same, persisting representations. Within the study of visual perception, the maintenance of such persisting object representations is supported by "object files"—episodic representations that store (and update) information about objects' properties and track objects over time and motion via spatiotemporal information. Although object files are typically discussed as visual, here we demonstrate that object-file correspondence can be computed across sensory modalities. An object file can be initially formed with visual input and later accessed with corresponding auditory information, suggesting that object files may operate at a multimodal level of perceptual processing.
Keywords: Auditory; Cognition; Multisensory; Object file; Visual