Abstract: | Multisensory integration of nonspatial features between vision and touch was investigated by examining the effects of redundant visual and tactile signals. In the present experiments, visual and/or tactile letter stimuli were presented, and participants were asked to identify them as quickly as possible. The results of Experiment 1 demonstrated faster reaction times for bimodal than for unimodal stimuli (the redundant signals effect, RSE). The RSE was attributable to coactivation of figural representations from the visual and tactile modalities. This coactivation occurred neither in a simple stimulus detection task (Experiment 2) nor for bimodal stimuli that carried the same semantic information but differed in physical stimulus features (Experiment 3). The findings suggest that the integration process may occur at a relatively early stage of object identification, prior to the decision level. |