Linguistically mediated visual search

Authors: Spivey M. J., Tyler M. J., Eberhard K. M., Tanenhaus M. K.

Affiliation: Department of Psychology, Cornell University; Department of Psychology, University of Notre Dame; Department of Brain & Cognitive Sciences, University of Rochester

Abstract: During an individual's normal interaction with the environment and other humans, visual and linguistic signals often coincide and can be integrated very quickly. This has been clearly demonstrated in recent eyetracking studies showing that visual perception constrains on-line comprehension of spoken language. In a modified visual search task, we found the inverse: real-time language comprehension can also constrain visual perception. In standard visual search tasks, the number of distractors in the display strongly affects search time for a target defined by a conjunction of features, but not for a target defined by a single feature. However, we found that when a conjunction target was identified by a spoken instruction presented concurrently with the visual display, the incremental processing of spoken language allowed the search process to proceed in a manner considerably less affected by the number of distractors. These results suggest that perceptual systems specialized for language and for vision interact more fluidly than previously thought.

Keywords:

Indexed in PubMed and other databases.