Multisensory spatial representations in eye-centered coordinates for reaching
Authors:Alexandre Pouget, Jean Christophe Ducom, Jeffrey Torri, Daphne Bavelier
Affiliation:Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627, USA. alex@bcs.rochester.edu
Abstract:Humans can reach for objects with their hands whether the objects are seen, heard or touched. Thus, the position of objects is recoded in a joint-centered frame of reference regardless of the sensory modality involved. Our study indicates that this frame of reference is not the only one shared across sensory modalities. The location of reaching targets is also encoded in eye-centered coordinates, whether the targets are visual, auditory, proprioceptive or imaginary. Furthermore, the remembered eye-centered location is updated after each eye and head movement. This is quite surprising since, in principle, a reaching motor command can be computed from any non-visual modality without ever recovering the eye-centered location of the stimulus. This finding may reflect the predominant role of vision in human spatial perception.
Keywords:Multisensory spatial representations; Eye-centered coordinates; Reaching
This article is indexed in ScienceDirect, PubMed, and other databases.
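
The abstract's central point is a piece of coordinate bookkeeping: a target's location can be stored in eye-centered coordinates, and that stored location must be updated whenever the eyes move. The sketch below is only an illustration of that geometry, not the paper's method or analysis; the function names, the rotation convention, and all numeric values are assumptions chosen for this example.

    import numpy as np

    def to_eye_centered(target_world, eye_position_world, world_to_eye_rotation):
        # Express a target location (given in a body/world frame) in
        # eye-centered coordinates: translate to the eye, then rotate
        # into the current gaze frame.
        return world_to_eye_rotation @ (target_world - eye_position_world)

    def remap_after_eye_movement(target_eye, eye_rotation_in_eye_frame):
        # Update a remembered eye-centered location after the eye rotates.
        # The stored vector is counter-rotated so it still points at the
        # same external location relative to the new gaze direction.
        return eye_rotation_in_eye_frame.T @ target_eye

    # Illustrative numbers only: a target 0.5 m ahead and slightly to the
    # right, with the eye at the origin looking straight ahead.
    target_world = np.array([0.10, 0.00, 0.50])   # metres
    eye_position = np.array([0.00, 0.00, 0.00])
    gaze = np.eye(3)

    target_eye = to_eye_centered(target_world, eye_position, gaze)

    # A 10-degree horizontal saccade, modelled as a rotation about the
    # vertical (y) axis of the eye frame.
    theta = np.deg2rad(10.0)
    saccade = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                        [ 0.0,           1.0, 0.0          ],
                        [-np.sin(theta), 0.0, np.cos(theta)]])

    updated_target_eye = remap_after_eye_movement(target_eye, saccade)
    print("before saccade:", target_eye)
    print("after saccade: ", updated_target_eye)

The same bookkeeping applies whether the stored target was originally seen, heard, felt or imagined, which is why the finding that non-visual targets are nonetheless encoded and updated in eye-centered coordinates is notable.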