How iconic gestures enhance communication: an ERP study
Authors: Ying Choon Wu, Seana Coulson
Affiliation: Cognitive Science Department 0515, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093-0515, USA.
Abstract: EEG was recorded as adults watched short segments of spontaneous discourse in which the speaker's gestures and utterances contained complementary information. Videos were followed by one of four types of picture probes: cross-modal related probes were congruent with both speech and gestures; speech-only related probes were congruent with information in the speech, but not the gesture; and two sorts of unrelated probes were created by pairing each related probe with a different discourse prime. Event-related potentials (ERPs) elicited by picture probes were measured within the time windows of the N300 (250-350 ms post-stimulus) and N400 (350-550 ms post-stimulus). Cross-modal related probes elicited smaller N300 and N400 than speech-only related ones, indicating that pictures were easier to interpret when they corresponded with gestures. N300 and N400 effects were not due to differences in the visual complexity of each probe type, since the same cross-modal and speech-only picture probes elicited N300 and N400 with similar amplitudes when they appeared as unrelated items. These findings extend previous research on gesture comprehension by revealing how iconic co-speech gestures modulate conceptualization, enabling listeners to better represent visuo-spatial aspects of the speaker's meaning.
Keywords: Gesture; N400; N300; Semantic integration; Language comprehension; Object recognition; Conceptual integration; Embodiment; ERP; Meaning; Simulation