Similar Articles
 20 similar articles found (search time: 31 ms)
1.
Seeing, acting, understanding: motor resonance in language comprehension   (Total citations: 7; self-citations: 0; citations by others: 7)
Observing actions and understanding sentences about actions activates corresponding motor processes in the observer-comprehender. In 5 experiments, the authors addressed 2 novel questions regarding language-based motor resonance. The 1st question asks whether visual motion that is associated with an action produces motor resonance in sentence comprehension. The 2nd question asks whether motor resonance is modulated during sentence comprehension. The authors' experiments provide an affirmative response to both questions. A rotating visual stimulus affects both actual manual rotation and the comprehension of manual rotation sentences. Motor resonance is modulated by the linguistic input and is a rather immediate and localized phenomenon. The results are discussed in the context of theories of action observation and mental simulation.

2.
Perception of motion affects language processing   (Total citations: 1; self-citations: 0; citations by others: 1)
Recently developed accounts of language comprehension propose that sentences are understood by constructing a perceptual simulation of the events being described. These simulations involve the re-activation of patterns of brain activation that were formed during the comprehender's interaction with the world. In two experiments we explored the specificity of the processing mechanisms required to construct simulations during language comprehension. Participants listened to (and made judgments on) sentences that described motion in a particular direction (e.g. "The car approached you"). They simultaneously viewed dynamic black-and-white stimuli that produced the perception of movement in the same direction as the action specified in the sentence (i.e. towards you) or in the opposite direction as the action specified in the sentence (i.e. away from you). Responses were faster to sentences presented concurrently with a visual stimulus depicting motion in the opposite direction as the action described in the sentence. This suggests that the processing mechanisms recruited to construct simulations during language comprehension are also used during visual perception, and that these mechanisms can be quite specific.

3.
Embodiment theory proposes that neural systems for perception and action are also engaged during language comprehension. Previous neuroimaging and neurophysiological studies have only been able to demonstrate modulation of action systems during comprehension of concrete language. We provide neurophysiological evidence for modulation of motor system activity during the comprehension of both concrete and abstract language. In Experiment 1, when the described direction of object transfer or information transfer (e.g., away from the reader to another) matched the literal direction of a hand movement used to make a response, speed of responding was faster than when the two directions mismatched (an action-sentence compatibility effect). In Experiment 2, we used single-pulse transcranial magnetic stimulation to study changes in the corticospinal motor pathways to hand muscles while reading the same sentences. Relative to sentences that do not describe transfer, there is greater modulation of activity in the hand muscles when reading sentences describing transfer of both concrete objects and abstract information. These findings are discussed in relation to the human mirror neuron system.

4.
The mu rhythms (8–13 Hz) and beta rhythms (15–30 Hz) of the EEG are observed at the central electrodes (C3, Cz, and C4) in resting states and become suppressed when participants perform a manual action or observe another's action. This has led researchers to consider these rhythms electrophysiological markers of motor neuron activity in humans. This study tested whether the comprehension of action language, unlike abstract language, modulates mu and low beta rhythms (15–20 Hz) in a way similar to the observation of real actions. Log-ratios were calculated for each oscillatory band between each condition and baseline resting periods. The results indicated that both action language and action videos caused mu and beta suppression (negative log-ratios), whereas abstract language did not, confirming the hypothesis that understanding action language activates motor networks in the brain. In other words, the resonance of motor areas associated with action language is compatible with the embodiment approach to linguistic meaning.
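The condition-versus-baseline log-ratio used in this kind of analysis is straightforward to compute from band-limited power. A minimal sketch, assuming single-channel epoch arrays and Welch power spectra; the sampling rate, epoch shapes, and variable names are illustrative assumptions, not details taken from the study:

```python
import numpy as np
from scipy.signal import welch

def band_power(epochs, sfreq, fmin, fmax):
    """Mean spectral power in [fmin, fmax] Hz for one channel.

    epochs: array of shape (n_epochs, n_samples), e.g. epochs recorded at C3.
    """
    freqs, psd = welch(epochs, fs=sfreq, nperseg=min(256, epochs.shape[-1]), axis=-1)
    band = (freqs >= fmin) & (freqs <= fmax)
    return psd[..., band].mean()

def log_ratio(condition_epochs, baseline_epochs, sfreq, band):
    """log10(condition power / baseline power); negative values indicate suppression."""
    fmin, fmax = band
    return np.log10(band_power(condition_epochs, sfreq, fmin, fmax)
                    / band_power(baseline_epochs, sfreq, fmin, fmax))

# Hypothetical usage with simulated data (40 one-second trials at 500 Hz)
sfreq = 500
rng = np.random.default_rng(0)
action_language = rng.normal(size=(40, sfreq))  # stand-in for action-sentence epochs
baseline = rng.normal(size=(40, sfreq))         # stand-in for resting-baseline epochs

print("mu   log-ratio:", log_ratio(action_language, baseline, sfreq, (8, 13)))
print("beta log-ratio:", log_ratio(action_language, baseline, sfreq, (15, 20)))
```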

5.
Although there is increasing evidence to suggest that language is grounded in perception and action, the relationship between language and emotion is less well understood. We investigate the grounding of language in emotion using a novel approach that examines the relationship between the comprehension of a written discourse and the performance of affect-related motor actions (hand movements towards and away from the body). Results indicate that positively and negatively valenced words presented in context influence motor responses (Experiment 1), whilst valenced words presented in isolation do not (Experiment 3). Furthermore, whether discourse context indicates that an utterance should be interpreted literally or ironically can influence motor responding, suggesting that the grounding of language in emotional states can be influenced by discourse-level factors (Experiment 2). In addition, the finding of affect-related motor responses to certain forms of ironic language, but not to non-ironic control sentences, suggests that phrasing a message ironically may influence the emotional response that is elicited.

6.
Guan Qun. Psychological Science (心理科学), 2007, 30(5): 1252-1256
The embodied cognition view attempts to give a systematic theoretical account of how mind, body, and world interact, holding that "cognition is a highly embodied and situated activity" (Michael L. Anderson, 2004): cognition arises from the interaction between body and environment and depends on particular kinds of experience. From this perspective, mental simulation is a means of language comprehension, achieved by re-situating oneself in the described situation. Following this logic, this paper reviews experimental research at the word, sentence, and discourse levels of information processing, confirming that perceptual, motor, and other experiential traces are activated during language processing and supporting the view that language comprehension is a process of mental simulation of sensorimotor and other relevant experience. Such mental simulation requires comprehenders to re-enter the situation and resonate with their existing experience of listening, reading, speaking, and writing, thereby offering a new interpretation of language comprehension and enriching the theory and practice of cognitive linguistics.

7.
This study used a dual-task paradigm to analyze the time course of motor resonance during the comprehension of action language. In the study, participants read sentences describing a transfer either away from (“I threw the tennis ball to my rival”) or toward themselves (“My rival threw me the tennis ball”). When the transfer verb appeared on the screen, and after a variable stimulus onset asynchrony (SOA), a visual motion cue (Experiment 1) or a static cue (Experiment 2) prompted participants to move their hand either away from or toward themselves to press a button. The results showed meaning–action interference at short SOAs and facilitation at the longest SOA for the matching conditions. These results support the hypothesis that motor processes associated with the comprehension of action-related language interfere with an overlapping motor task, whereas they facilitate a delayed motor task. These effects are discussed in terms of resonance processes in the motor cortex.

8.
Whether the brain's speech-production system is also involved in speech comprehension is a topic of much debate. Research has focused on whether motor areas are involved in listening, but overlap between speaking and listening might occur not only at primary sensory and motor levels, but also at linguistic levels (where semantic, lexical, and syntactic processes occur). Using functional MRI adaptation during speech comprehension and production, we found that the brain areas involved in semantic, lexical, and syntactic processing are mostly the same for speaking and for listening. Effects of primary processing load (indicative of sensory and motor processes) overlapped in auditory cortex and left inferior frontal cortex, but not in motor cortex, where processing load affected activity only in speaking. These results indicate that the linguistic parts of the language system are used for both speaking and listening, but that the motor system does not seem to provide a crucial contribution to listening.

9.
10.
In a behavioral study we analyzed the influence of visual action primes on abstract action sentence processing. We thereby aimed at investigating mental motor involvement during processes of meaning constitution of action verbs in abstract contexts. In the first experiment, participants executed either congruous or incongruous movements parallel to a video prime. In the second experiment, we added a no-movement condition. After the execution of the movement, participants rendered a sensibility judgment on action sentence targets. It was expected that congruous movements would facilitate both concrete and abstract action sentence comprehension in comparison to the incongruous and the no-movement condition. Results in Experiment 1 showed a concreteness effect but no effect of motor priming. Experiment 2 revealed a concreteness effect as well as an interaction effect of the sentence and the movement condition. The findings indicate an involvement of motor processes in abstract action language processing on a behavioral level.

11.
We report a novel finding on the relation of emotion and language. Covert manipulation of emotional facial posture interacts with sentence valence when measuring the amount of time to judge valence (Experiment 1) and sensibility (Experiment 2) of the sentence. In each case, an emotion-sentence compatibility effect is found: Judgment times are faster when facial posture and sentence valence match than when they mismatch. We interpret the finding using a simulation account; that is, emotional systems contribute to language comprehension much as they do in social interaction. Because the effect was not observed on a lexical decision task using emotion-laden words (Experiment 3), we suggest that the emotion simulation affects comprehension processes beyond initial lexical access.

12.
We argue that psycholinguistics should be concerned with both the representation and the processing of language. Recent experimental work on syntax in language comprehension has largely concentrated on the way in which language is processed, and has assumed that theoretical linguistics serves to determine the representation of language. In contrast, we advocate experimental work on the mental representation of grammatical knowledge, and argue that syntactic priming is a promising way to do this. Syntactic priming is the phenomenon whereby exposure to a sentence with a particular syntactic construction can affect the subsequent processing of an otherwise unrelated sentence with the same (or, perhaps, related) structure, by virtue of that structure. We assess evidence for syntactic priming in corpora, and then consider experimental evidence for priming in production and comprehension, and for bidirectional priming between comprehension and production. This in particular strongly suggests that priming is tapping into linguistic knowledge itself, and is not just facilitating particular processes. The final section discusses the importance of priming evidence for any account of language construed as the mental representation of human linguistic capacities.

13.
Pelekanos V, Moutoussis K. Perception, 2011, 40(12): 1402-1412
Embodied cognition and perceptual symbol theories assume that higher cognition interacts with and is grounded in perception and action. Recent experiments have shown that language processing interacts with perceptual processing in various ways, indicating that linguistic representations have a strong perceptual character. In the present study, we have used signal detection theory to investigate whether the comprehension of written sentences, implying either horizontal or vertical orientation, could improve the participants' visual sensitivity for discriminating between horizontal or vertical square-wave gratings and noise. We tested this prediction by conducting one main and one follow-up experiment. Our results indicate that language can, indeed, affect perception at such a low level of the visual process and thus provide further support for the embodied theories of cognition.

14.
We report a new phenomenon associated with language comprehension: the action-sentence compatibility effect (ACE). Participants judged whether sentences were sensible by making a response that required moving toward or away from their bodies. When a sentence implied action in one direction (e.g., "Close the drawer" implies action away from the body), the participants had difficulty making a sensibility judgment requiring a response in the opposite direction. The ACE was demonstrated for three sentence types: imperative sentences, sentences describing the transfer of concrete objects, and sentences describing the transfer of abstract entities, such as "Liz told you the story." These data are inconsistent with theories of language comprehension in which meaning is represented as a set of relations among nodes. Instead, the data support an embodied theory of meaning that relates the meaning of sentences to human action.
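The ACE is typically quantified as the difference in mean judgment latency between direction-matching and direction-mismatching responses. A minimal sketch of that comparison on simulated per-participant means; all values, the sample size, and the paired t-test are illustrative assumptions, not the study's data or analysis:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(3)
n_subjects = 24
# Hypothetical per-subject mean sensibility-judgment RTs (ms)
rt_match = rng.normal(loc=1180, scale=90, size=n_subjects)              # response direction matches implied action
rt_mismatch = rt_match + rng.normal(loc=55, scale=40, size=n_subjects)  # mismatch adds a simulated cost

ace = (rt_mismatch - rt_match).mean()        # positive value = compatibility effect
t, p = ttest_rel(rt_mismatch, rt_match)
print(f"ACE = {ace:.0f} ms, t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")
```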

15.
Recent theories propose that semantic representation and sensorimotor processing have a common substrate via simulation. We tested the prediction that comprehension interacts with perception, using a standard psychophysics methodology. While passively listening to verbs that referred to upward or downward motion, and to control verbs that did not refer to motion, 20 subjects performed a motion-detection task, indicating whether or not they saw motion in visual stimuli containing threshold levels of coherent vertical motion. A signal detection analysis revealed that when verbs were directionally incongruent with the motion signal, perceptual sensitivity was impaired. Word comprehension also affected decision criteria and reaction times, but in different ways. The results are discussed with reference to existing explanations of embodied processing and the potential of psychophysical methods for assessing interactions between language and perception.
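The sensitivity and criterion measures reported in such signal detection analyses are conventionally d′ and c, derived from hit and false-alarm rates. A minimal sketch; the trial counts are invented for illustration, and the log-linear correction is a common convention rather than something stated in the abstract:

```python
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Equal-variance signal detection measures: sensitivity d' and criterion c.

    A log-linear correction keeps hit/false-alarm rates away from 0 and 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)
    return d_prime, criterion

# Hypothetical counts for congruent vs. incongruent verb-motion trials
print(dprime_and_criterion(hits=38, misses=12, false_alarms=10, correct_rejections=40))
print(dprime_and_criterion(hits=30, misses=20, false_alarms=14, correct_rejections=36))
```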

16.
A structural modeling approach was used to examine the relationships between age, verbal working memory (vWM), and 3 types of language measures: online syntactic processing, sentence comprehension, and text comprehension. The best-fit model for the online-processing measure revealed a direct effect of age on online sentence processing, but no effect mediated through vWM. The best-fit models for sentence and text comprehension included an effect of age mediated through vWM and no direct effect of age. These results indicate that the relationship among age, vWM, and comprehension differs depending on the measure of language processing and support the view that individual differences in vWM do not affect individuals' online syntactic processing.
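The mediated path reported here (age → vWM → comprehension) can be illustrated with a simple regression-based mediation sketch. This is a simplified stand-in for the structural equation modeling the study actually used; the variable names, simulated effects, and two-step estimation are all assumptions for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
age = rng.uniform(20, 80, n)
vwm = -0.03 * age + rng.normal(scale=0.5, size=n)           # simulated: vWM declines with age
comprehension = 0.8 * vwm + rng.normal(scale=0.5, size=n)   # simulated: comprehension driven by vWM

# Path a: age -> vWM
a = sm.OLS(vwm, sm.add_constant(age)).fit().params[1]
# Paths b (vWM -> comprehension) and c' (direct age effect), estimated jointly
X = sm.add_constant(np.column_stack([age, vwm]))
fit = sm.OLS(comprehension, X).fit()
c_direct, b = fit.params[1], fit.params[2]

# A fully mediated pattern shows a sizable indirect effect and a near-zero direct effect
print(f"indirect effect (a*b): {a * b:.3f}   direct effect (c'): {c_direct:.3f}")
```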

17.
Understanding each other is a core concept of social cohesion and, consequently, has immense value in human society. Importantly, shared information leading to cohesion can come from two main sources: observed action and/or language (word) processing. In this paper, we propose a theoretical framework for the link between action observation and action verb processing. Based on the activation of common semantic representations of actions through semantic resonance, this model can account for the neurophysiological, behavioral and neuropsychological domains in the link between action observation and language. Semantic resonance is hypothesized to play a role beyond that of the mere observation of others and can benefit future studies trying to connect action production and language.

18.
Evidence from numerous studies using the visual world paradigm has revealed both that spoken language can rapidly guide attention in a related visual scene and that scene information can immediately influence comprehension processes. These findings motivated the coordinated interplay account (Knoeferle & Crocker, 2006) of situated comprehension, which claims that utterance-mediated attention crucially underlies this closely coordinated interaction of language and scene processing. We present a recurrent sigma-pi neural network that models the rapid use of scene information, exploiting an utterance-mediated attentional mechanism that directly instantiates the CIA. The model is shown to achieve high levels of performance (both with and without scene contexts), while also exhibiting hallmark behaviors of situated comprehension, such as incremental processing, anticipation of appropriate role fillers, as well as the immediate use, and priority, of depicted event information through the coordinated use of utterance-mediated attention to the scene.
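In a sigma-pi unit the net input is a sum over products of inputs, which is what allows utterance-mediated attention to multiplicatively gate the scene representation. A toy sketch of that gating idea; the shapes, variable names, and exact wiring are illustrative assumptions and not the published model's architecture:

```python
import numpy as np

def sigma_pi_gate(attention, scene_events, W):
    """One sigma-pi style layer: attention multiplicatively gates scene input.

    attention:    (n_events,)          attention over depicted events
    scene_events: (n_events, n_feat)   feature vector for each depicted event
    W:            (n_feat, n_out)      ordinary connection weights
    The net input sums over products of attention and scene features, so
    unattended parts of the scene contribute almost nothing to the output.
    """
    gated = attention[:, None] * scene_events   # product terms (attention x features)
    return np.tanh(gated.sum(axis=0) @ W)       # summed, weighted, then squashed

rng = np.random.default_rng(2)
scene = rng.normal(size=(3, 8))       # three depicted events, 8 features each
attn = np.array([0.9, 0.05, 0.05])    # utterance-mediated attention on the first event
W = rng.normal(size=(8, 4))
print(sigma_pi_gate(attn, scene, W))
```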

19.
Language development has long been associated with motor development, particularly manual gesture. We examined a variety of motor abilities – manual gesture including symbolic, meaningless and sequential memory, oral motor control, gross and fine motor control – in 129 children aged 21 months. Language abilities were assessed and cognitive and socio-economic measures controlled for. Oral motor control was strongly associated with language production (vocabulary and sentence complexity), with some contribution from symbolic abilities. Language comprehension, however, was associated with cognitive and socio-economic measures. We conclude that symbolic, working memory, and mirror neuron accounts of language–motor control links are limited, but that a common neural and motor substrate for nonverbal and verbal oral movements may drive the motor–language association.

20.
Previous reports have demonstrated that the comprehension of sentences describing motion in a particular direction (toward, away, up, or down) is affected by concurrently viewing a stimulus that depicts motion in the same or opposite direction. We report 3 experiments that extend our understanding of the relation between perception and language processing in 2 ways. First, whereas most previous studies of the relation between perception and language processing have focused on visual perception, our data show that sentence processing can be affected by the concurrent processing of auditory stimuli. Second, it is shown that the relation between the processing of auditory stimuli and the processing of sentences depends on whether the sentences are presented in the auditory or visual modality.
