Similar literature
 20 similar articles retrieved.
1.
A group of individuals conversing in natural dyads and a group of lecturers were observed for lateral hand movement patterns during speech. Right-handed individuals in both groups displayed a significant right hand bias for gesture movements but no lateral bias for self-touching movements. The study provides external validity for previous laboratory studies of lateralized hand gesture. The results were interpreted as evidence of a central processor for both spoken and gestural communication.

2.
殷融 《心理科学进展》2020,28(7):1141-1155
Language evolution is an important topic in evolutionary psychology. The mirror system hypothesis, the tool-making hypothesis, and the teaching hypothesis each explain the relationship between manual action and language evolution from a different angle, and all three hold that human language originated in the experience of hand movements. Empirical research has found that sign language and spoken language share consistent features, that language and hand movements share a common neural basis, that gesture development predicts the level of language development, and that gestures improve the efficiency with which tool-making knowledge is transmitted; these findings provide empirical support for the specific claims of the three hypotheses. Future research in this field should attend to the developmental relationship between gestural language and spoken language over the course of evolution, as well as to the relationship between the evolution of human language and the evolution of other cognitive traits.

3.
Studies in human subjects indicate that manual gestures accompanied by speech are produced more often by the right compared to the left hand. Additional studies indicate that the production of sign language is controlled by the same brain areas as speech, suggesting similar neurobiological substrates for language that are not modality specific. We report evidence that chimpanzees exhibit preferential use of the right hand in gestural communication. Moreover, use of the right hand in gestural communication is significantly enhanced when accompanied by a vocalization, particularly among human‐reared chimpanzees. Taken together, the data suggest that the lateralization of manual and speech systems of communication may date back as far as 5 million years ago.

4.
Shared tool use in our hominid ancestry is perhaps the most satisfactory explanation for human dextrality and left-hemisphere language lateralization. Recent palaeoarchaeological evidence suggests that brachiation preceded bipedalism, which in turn preceded advanced tool use, with all three preceding any dramatic increase in brain size and/or the development of speech-related neural structures. Shared tool use probably led to population dextrality, and then to the development of left-hemisphere centres for fine motor coordination and the mediation of serial, segmental, time-dependent and syntactic processes at sensory and more particularly motor levels, including the control of limbs, fingers and articulators. Such centres, initially developed for tool construction and use, would have occupied an intermediate position in the evolutionary sequence. Thus cortically-driven facial gestures may possibly have accompanied manual signing, modulating limbic vocalizations of affect, though a cortical-limbic distinction with respect to communication may be unwarranted. However, the uniqueness of human language is still a matter of debate both with respect to other primates and our own evolutionary ancestors.

5.
Neural correlates of bimodal speech and gesture comprehension
The present study examined the neural correlates of speech and hand gesture comprehension in a naturalistic context. Fifteen participants watched audiovisual segments of speech and gesture while event-related potentials (ERPs) were recorded to the speech. Gesture influenced the ERPs to the speech. Specifically, there was a right-lateralized N400 effect, reflecting semantic integration, when gestures mismatched versus matched the speech. In addition, early sensory components in bilateral occipital and frontal sites differentiated speech accompanied by matching versus non-matching gestures. These results suggest that hand gestures may be integrated with speech at early and late stages of language processing.

6.
Tilsen S. Cognitive Science, 2009, 33(5): 839-879
Temporal patterns in human movement, and in speech in particular, occur on multiple timescales. Regularities in such patterns have been observed between speech gestures, which are relatively quick movements of articulators (e.g., tongue fronting and lip protrusion), and also between rhythmic units (e.g., syllables and metrical feet), which occur more slowly. Previous work has shown that patterns in both domains can be usefully modeled with oscillatory dynamical systems. To investigate how rhythmic and gestural domains interact, an experiment was conducted in which speakers performed a phrase repetition task, and gestural kinematics were recorded using electromagnetic articulometry. Variance in relative timing of gestural movements was correlated with variance in rhythmic timing, indicating that gestural and rhythmic systems interact in the process of planning and producing speech. A model of rhythmic and gestural planning oscillators with multifrequency coupling is presented, which can simulate the observed covariability between rhythmic and gestural timing.
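The coupled planning-oscillator idea can be illustrated with a minimal toy sketch (an assumption-laden illustration, not Tilsen's actual model or parameters): a slow rhythmic oscillator is coupled 2:1 to two faster gestural oscillators, so noise injected at the rhythmic level propagates into the relative timing of the gestures.

# Minimal sketch (assumed parameters, not the published model) of planning
# oscillators with multifrequency coupling: one slow "rhythmic" oscillator
# entrains two faster "gestural" oscillators at a 2:1 frequency ratio.
import numpy as np

def simulate(steps=20000, dt=1e-3, noise=0.05, seed=0):
    rng = np.random.default_rng(seed)
    f = np.array([2.0, 4.0, 4.0])   # intrinsic frequencies (Hz): rhythmic, gestural 1, gestural 2
    k_rg, k_gg = 5.0, 1.0           # rhythmic-to-gestural (2:1) and gestural-to-gestural (1:1) coupling
    phase = rng.uniform(0, 2 * np.pi, 3)
    for _ in range(steps):
        r, g1, g2 = phase
        dr = 2 * np.pi * f[0]
        dg1 = 2 * np.pi * f[1] + k_rg * np.sin(2 * r - g1) + k_gg * np.sin(g2 - g1)
        dg2 = 2 * np.pi * f[2] + k_rg * np.sin(2 * r - g2) + k_gg * np.sin(g1 - g2)
        phase += dt * np.array([dr, dg1, dg2])
        # Planning noise enters through the slow rhythmic oscillator only.
        phase[0] += noise * np.sqrt(dt) * rng.standard_normal()
    # Wrapped relative phase of the two gestural oscillators stands in for
    # relative gestural timing at the end of the planning interval.
    return np.angle(np.exp(1j * (phase[1] - phase[2])))

# Across many simulated "utterances", the spread of gestural relative timing
# reflects the variability injected at the rhythmic level.
timings = [simulate(seed=s) for s in range(50)]
print(np.var(timings))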

7.
8.
In human face-to-face communication, language comprehension is a multi-modal, situated activity. However, little is known about how we combine information from different modalities during comprehension, and how perceived communicative intentions, often signaled through visual signals, influence this process. We explored this question by simulating a multi-party communication context in which a speaker alternated her gaze between two recipients. Participants viewed speech-only or speech + gesture object-related messages when being addressed (direct gaze) or unaddressed (gaze averted to other participant). They were then asked to choose which of two object images matched the speaker’s preceding message. Unaddressed recipients responded significantly more slowly than addressees for speech-only utterances. However, perceiving the same speech accompanied by gestures sped unaddressed recipients up to a level identical to that of addressees. That is, when unaddressed recipients’ speech processing suffers, gestures can enhance the comprehension of a speaker’s message. We discuss our findings with respect to two hypotheses attempting to account for how social eye gaze may modulate multi-modal language comprehension.

9.
10.
The present study investigates whether knowledge about the intentional relationship between gesture and speech influences controlled processes when integrating the two modalities at comprehension. Thirty-five adults watched short videos of gesture and speech that conveyed semantically congruous and incongruous information. In half of the videos, participants were told that the two modalities were intentionally coupled (i.e., produced by the same communicator), and in the other half, they were told that the two modalities were not intentionally coupled (i.e., produced by different communicators). When participants knew that the same communicator produced the speech and gesture, there was a larger bilateral frontal and central N400 effect to words that were semantically incongruous versus congruous with gesture. However, when participants knew that different communicators produced the speech and gesture, that is, when gesture and speech were not intentionally meant to go together, the N400 effect was present only in right-hemisphere frontal regions. The results demonstrate that pragmatic knowledge about the intentional relationship between gesture and speech modulates controlled neural processes during the integration of the two modalities.

11.
Three experiments investigated the "McGurk effect," whereby optically specified syllables experienced synchronously with acoustically specified syllables integrate in perception to determine a listener's auditory perceptual experience. The experiments contrasted the cross-modal effect of orthographic on acoustic syllables, presumed to be associated in experience and memory, with that of haptically experienced and acoustic syllables, presumed not to be associated. The latter pairing gave rise to cross-modal influences when subjects were informed that the cross-modal syllables were paired independently. Mouthed syllables affected reports of simultaneously heard syllables (and vice versa). These effects were absent when syllables were simultaneously seen (spelled) and heard. The McGurk effect does not arise from association in memory but from conjoint near-specification of the same causal source in the environment: in speech, the moving vocal tract producing phonetic gestures.

12.
Some situations require one to quickly stop an initiated response. Recent evidence suggests that rapid stopping engages a mechanism that has diffuse effects on the motor system. For example, stopping the hand dampens the excitability of the task-irrelevant leg. However, it is unclear whether this ‘global suppression’ could apply across wider motor modalities. Here we tested whether stopping speech leads to suppression of the task-irrelevant hand. We used transcranial magnetic stimulation over the primary motor cortex with concurrent electromyography from the hand. We found that when speech was successfully stopped, the motor evoked potential from the task-irrelevant hand was significantly reduced compared to when the participant failed to stop speaking, responded on trials without a stop signal, or compared to baseline. This shows that when speech is quickly stopped, there is a broad suppression across the motor system. This has implications for the neural basis of speech control and stuttering.

13.
This paper summarizes the various physical science elements of the evolutionary story. The “nonlinear” revolution includes chaos, self‐organization theory, the thermodynamics of evolution, and biological evolution. The key result of the revolution is a physical understanding of how order emerges and change is driven. This paper brings the lessons of each branch of the revolution together into a single easily understood thread. The unexpected result of this revolution is an understanding of evolution as a single inexorable physical process that has given rise to everything from molecules to humankind. Here evolution is driven by energy flow and involves increasing efficiency, the growth of complexity, and the acceleration of change. This paper ties the resulting physical insights to an evolving ecological world view, looks at its ties to spirituality and applies the rules that govern the growth of complexity to business and its current crisis. The net result of the nonlinear revolution is a radical change in our everyday understanding of how the world works, a shift away from classical science's machine world to the vision of an evolving deeply ecologically intertwined world.

14.
The way adults express manner and path components of a motion event varies across typologically different languages both in speech and cospeech gestures, showing that language specificity in event encoding influences gesture. The authors tracked when and how this multimodal cross-linguistic variation develops in children learning Turkish and English, 2 typologically distinct languages. They found that children learn to speak in language-specific ways from age 3 onward (i.e., English speakers used 1 clause and Turkish speakers used 2 clauses to express manner and path). In contrast, English- and Turkish-speaking children's gestures looked similar at ages 3 and 5 (i.e., separate gestures for manner and path), differing from each other only at age 9 and in adulthood (i.e., English speakers used 1 gesture, but Turkish speakers used separate gestures for manner and path). The authors argue that this pattern of the development of cospeech gestures reflects a gradual shift to language-specific representations during speaking and shows that looking at speech alone may not be sufficient to understand the full process of language acquisition.

15.
In the last two decades the integrative role of the frontal premotor cortex (a mosaic of agranular/dysgranular areas lying in front of the primary motor cortex) has been increasingly elucidated. Among its various functions, sensorimotor transformation and the storage of action representations, also for non-strictly motor purposes, are the most intriguing properties of this region, as shown by several studies. In this article we will mainly focus on the ventro-rostral part of the monkey premotor cortex (area F5), in which visual information describing objects and others' acting hands is associated with goal-directed motor representations of hand movements. We will describe the main characteristics of F5 premotor neurons and we will provide evidence in favor of a parallelism between monkeys and humans on the basis of new experimental observations. Finally, we will present some data indicating that, both in humans and in monkeys, action-related sensorimotor transformations are not restricted to visual information but also concern acoustic information.

16.
17.
18.
19.
Among right-handers, the magnitude of differences in proficiency between the left and right hands varies considerably. Yet the significance of the extent of right-handedness is still a controversial issue. To examine whether individual differences in asymmetry of hand skill can partly be attributed to individual differences in asymmetrical hemispheric activation, handedness and electroencephalographic (EEG) laterality were correlated in two large samples (ns = 60 and 128). Analysis indicated that part of the variability in right-handedness may arise from activation asymmetries in the cortex, but whether this relation becomes apparent depends on the cortical area examined and on the experimental condition under which the EEG measures are taken.
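As a rough illustration of this kind of analysis (a sketch with assumed variable names and simulated placeholder data, not the study's actual pipeline or results), one can compute a hand-skill laterality quotient and an EEG asymmetry index per participant and then correlate the two:

# Illustrative sketch only: relate a hand-skill asymmetry index to an EEG
# activation-asymmetry index and test the correlation across individuals.
import numpy as np
from scipy import stats

def hand_skill_asymmetry(right_score, left_score):
    # Laterality quotient in [-1, 1]; positive values mean a right-hand advantage.
    return (right_score - left_score) / (right_score + left_score)

def eeg_activation_asymmetry(power_right, power_left):
    # Log-power asymmetry for a homologous electrode pair (e.g., C4 vs C3).
    # Lower alpha power is conventionally read as greater activation, so a
    # positive value here indicates relatively greater left-hemisphere activation.
    return np.log(power_right) - np.log(power_left)

# Simulated sample of right-handers (placeholder data, not study results).
rng = np.random.default_rng(1)
n = 60
right_taps = rng.normal(55, 5, n)              # e.g., finger-tapping counts, right hand
left_taps = right_taps - rng.normal(5, 2, n)   # right hand usually faster
alpha_right = rng.normal(10.0, 1.5, n)         # alpha power over right sensorimotor cortex
alpha_left = alpha_right - 0.1 * (right_taps - left_taps) + rng.normal(0, 1, n)

hand_lq = hand_skill_asymmetry(right_taps, left_taps)
eeg_asym = eeg_activation_asymmetry(alpha_right, alpha_left)
r, p = stats.pearsonr(hand_lq, eeg_asym)
print(f"r = {r:.2f}, p = {p:.3f}")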

20.