Spatial imagery of novel places based on visual scene transformation
Authors:Naoyuki Sato
Institution:Department of Complex Systems, School of Systems Information Science, Future University Hakodate, 116-2 Kamedanakano-cho, Hakodate, Hokkaido 041-8655, Japan; Lab. for Dynamics of Emergent Intelligence, RIKEN Brain Science Institute, 2-1 Hirosawa, Wako-shi, Saitama 351-0198, Japan
Abstract:The hippocampus is known to maintain memories of object-place associations that can produce a scene expectation at a novel viewpoint. To implement this capability, the memorized distances and directions of an object from the viewer at a fixed location must be integrated with the imaginary displacement to the new viewpoint. However, the neural dynamics of such scene expectation at a novel viewpoint have not been discussed. In this study, we propose a method of coding novel places based on visual scene transformation as a component of object-place memory in the hippocampus. In this coding, a novel place is represented by a transformed version of the viewer's scene under an imaginary displacement. When the places of individual objects are stored with this coding in the hippocampus, an object's displacement at the imaginary viewpoint can be evaluated by comparing the transformed viewer's scene with the stored scene. Results of computer experiments demonstrated that the coding successfully produced a scene expectation of a three-object arrangement at a novel viewpoint. This scene expectation was retained even when the imaginary scene bore no similarity to the real scene at that location, in which case the imaginary scenes functioned only as indices denoting the topographical relationships between object locations. These results suggest that the hippocampus uses place coding based on scene transformation to implement spatial imagery of object-place associations from a novel viewpoint.
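
The geometry sketched in the abstract can be illustrated with a minimal Python example (not from the paper and not the author's neural model): here a "scene" is taken to be the set of egocentric distances and bearings of objects from a viewpoint, and the expected scene at a novel viewpoint is obtained by applying an imaginary displacement before recomputing those quantities. The function names and the three-object arrangement below are illustrative assumptions.

```python
import numpy as np

def egocentric_scene(objects, viewpoint, heading=0.0):
    """Return (distance, bearing) of each object as seen from `viewpoint`.
    A 'scene' here is the set of egocentric distances and directions."""
    rel = objects - viewpoint                              # allocentric offsets
    dist = np.linalg.norm(rel, axis=1)                     # distance to each object
    bearing = np.arctan2(rel[:, 1], rel[:, 0]) - heading   # direction relative to heading
    return dist, (bearing + np.pi) % (2 * np.pi) - np.pi   # wrap bearings to [-pi, pi)

def expect_scene_at(objects, current_view, displacement):
    """Transform the current scene by an imaginary displacement:
    the expected scene at the novel viewpoint current_view + displacement."""
    return egocentric_scene(objects, current_view + displacement)

# An illustrative three-object arrangement (arbitrary coordinates).
objects = np.array([[0.0, 2.0], [1.5, 0.5], [-1.0, 1.0]])
view = np.array([0.0, 0.0])

stored_dist, stored_dir = egocentric_scene(objects, view)
expect_dist, expect_dir = expect_scene_at(objects, view, np.array([1.0, -0.5]))

# The mismatch between the transformed (expected) scene and a stored scene
# plays the role of the comparison described in the abstract.
print("stored:  ", np.round(stored_dist, 2), np.round(stored_dir, 2))
print("expected:", np.round(expect_dist, 2), np.round(expect_dir, 2))
```

In this toy version the transformation is exact trigonometry; in the paper's setting the same comparison is carried out by hippocampal dynamics, and the transformed scene need not resemble the real view at the novel location to serve as an index for the object arrangement.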
Keywords:Hippocampus; Object-place associative memory; Mental navigation; Spatial cognition
This article is indexed in ScienceDirect and other databases.