Audiotactile multisensory interactions in human information processing

Authors: Norimichi Kitagawa, Charles Spence

Affiliation: NTT Communication Science Laboratories, NTT Corporation, Japan; Crossmodal Research Group, Department of Experimental Psychology, University of Oxford, UK
Abstract: The last few years have seen a very rapid growth of interest in how signals from different sensory modalities are integrated in the brain to form the unified percepts that fill our daily lives. Research on multisensory interactions between vision, touch, and proprioception has revealed the existence of multisensory spatial representations that code the location of external events relative to our own bodies. In this review, we highlight recent converging evidence from both human and animal studies showing that spatially modulated multisensory interactions also occur between hearing and touch, especially in the space immediately surrounding the head. These spatial audiotactile interactions for stimuli presented close to the head can affect not only the spatial aspects of perception, but also various other non-spatial aspects of audiotactile information processing. Finally, we highlight some of the most important questions for future research in this area.
Keywords: multisensory integration; crossmodal interaction; hearing; touch; peripersonal space