Eye movements and spoken language comprehension: effects of visual context on syntactic ambiguity resolution
| |
Authors: | Michael J. Spivey, Michael K. Tanenhaus, Kathleen M. Eberhard, Julie C. Sedivy
| |
Affiliation: | Department of Psychology, Cornell University, 238 Uris Hall, Ithaca, NY 14853, USA. mjs41@cornell.edu |
| |
Abstract: | When participants follow spoken instructions to pick up and move objects in a visual workspace, their eye movements to the objects are closely time-locked to referential expressions in the instructions. Two experiments used this methodology to investigate the processing of the temporary ambiguities that arise because spoken language unfolds over time. Experiment 1 examined the processing of sentences with a temporarily ambiguous prepositional phrase (e.g., "Put the apple on the towel in the box") using visual contexts that supported either the normally preferred initial interpretation (the apple should be put on the towel) or the less-preferred interpretation (the apple is already on the towel and should be put in the box). Eye movement patterns clearly established that the initial interpretation of the ambiguous phrase was the one consistent with the context. Experiment 2 replicated these results using prerecorded digitized speech to eliminate any possibility of prosodic differences across conditions or experimenter demand. Overall, the findings are consistent with a broad theoretical framework in which real-time language comprehension immediately takes into account a rich array of relevant nonlinguistic context. |
| |
Keywords: | Spoken language comprehension; Word recognition; Sentence processing; Syntactic ambiguity resolution; Modularity; Context effects; Information integration