Similar Literature
20 similar documents found.
1.
This study looks at whether there is a relationship between mother and infant gesture production. Specifically, it addresses the extent of articulation in the maternal gesture repertoire and how closely it supports the infant production of gestures. Eight Spanish mothers and their 1‐ and 2‐year‐old babies were studied during 1 year of observations. Maternal and child verbal production, gestures and actions were recorded at their homes on five occasions while performing daily routines. Results indicated that mother and child deictic gestures (pointing and instrumental) and representational gestures (symbolic and social) were very similar at each age group and did not decline across groups. Overall, deictic gestures were more frequent than representational gestures. Maternal adaptation to developmental changes is specific for gesturing but not for acting. Maternal and child speech were related positively to mother and child pointing and representational gestures, and negatively to mother and child instrumental gestures. Mother and child instrumental gestures were positively related to action production, after maternal and child speech was partialled out. Thus, language plays an important role for dyadic communicative activities (gesture–gesture relations) but not for dyadic motor activities (gesture–action relations). Finally, a comparison of the growth curves across sessions showed a closer correspondence for mother–child deictic gestures than for representational gestures. Overall, the results point to the existence of an articulated maternal gesture input that closely supports the child gesture production. Copyright © 2006 John Wiley & Sons, Ltd.

2.
Cognitive Development, 1997, 12(2): 185-197
Language development is thought to be associated with other domains of mental activity that require representational capacity, like symbolic gesture. Study 1 explores associations between multiple measures of language and gesture across the third and fourth years of life. Results indicate stability in language, partial stability in gesture, and concurrent and longitudinal associations between language and gesture. Study 2 further explores language-symbolic gesture relations by measuring more finely which aspects of language relate to the symbolic representation of actions with objects and by exploring associations between symbolic gesture and a performance (nonverbal) measure of general intellectual ability.

3.
Sighted speakers of different languages vary systematically in how they package and order components of a motion event in speech. These differences influence how semantic elements are organized in gesture, but only when those gestures are produced with speech (co‐speech gesture), not without speech (silent gesture). We ask whether the cross‐linguistic similarity in silent gesture is driven by the visuospatial structure of the event. We compared 40 congenitally blind adult native speakers of English or Turkish (20/language) to 80 sighted adult speakers (40/language; half with, half without blindfolds) as they described three‐dimensional motion scenes. We found an effect of language on co‐speech gesture, not on silent gesture—blind speakers of both languages organized their silent gestures as sighted speakers do. Humans may have a natural semantic organization that they impose on events when conveying them in gesture without language—an organization that relies on neither visuospatial cues nor language structure.

4.
Arm movements can influence language comprehension much as semantics can influence arm movement planning. Arm movement itself can be used as a linguistic signal. We reviewed neurophysiological and behavioural evidence that manual gestures and vocal language share the same control system. Studies of primate premotor cortex and, in particular, of the so-called "mirror system", including humans, suggest the existence of a dual hand/mouth motor command system involved in ingestion activities. This may be the platform on which a combined manual and vocal communication system was constructed. In humans, speech is typically accompanied by manual gesture, speech production itself is influenced by executing or observing transitive hand actions, and manual actions play an important role in the development of speech, from the babbling stage onwards. Behavioural data also show reciprocal influence between word and symbolic gestures. Neuroimaging and repetitive transcranial magnetic stimulation (rTMS) data suggest that the system governing both speech and gesture is located in Broca's area. In general, the presented data support the hypothesis that the hand motor-control system is involved in higher order cognition.

5.
Gesture is an important nonverbal medium in verbal communication: it is not only closely intertwined with language but also has distinct communicative and cognitive characteristics. This article reviews the relationship between gesture and verbal communication, the relatively independent communicative features of gesture, and gestural communication in educational settings. Specifically, it argues that, first, the joint expression of gesture and speech promotes language production as well as language comprehension, integration, and memory; second, gesture is to some extent independently communicative, and gesture–speech "mismatches" reflect changes in the information being communicated and in communicative cognition; finally, in educational settings, teachers' gestures can guide students' attention and clarify verbal information, while students' gestural communication helps support the cognitive processes of learning. Future research should further examine the influence of gesture on the communicative functions of language, the advantages and cognitive mechanisms of gestural communication during verbal communication, the cognitive mechanisms underlying the efficiency of gestural communication in educational settings, and the influencing factors, general characteristics, and individual differences of gestural communication.

6.
Gesture Reflects Language Development: Evidence From Bilingual Children
There is a growing awareness that language and gesture are deeply intertwined in the spontaneous expression of adults. Although some research suggests that children use gesture independently of speech, there is scant research on how language and gesture develop in children older than 2 years. We report here on a longitudinal investigation of the relation between gesture and language development in French-English bilingual children from 2 to 3 1/2 years old. The specific gesture types of iconics and beats correlated with the development of the children's two languages, whereas pointing types of gestures generally did not. The onset of iconic and beat gestures coincided with the onset of sentencelike utterances separately in each of the children's two languages. The findings show that gesture is related to language development rather than being independent from it. Contrasting theories about how gesture is related to language development are discussed.

7.
Co-speech gesture is a universal feature of human verbal communication and serves an information-exchange function. Depending on the purpose of production and the scope of application, gestures can be divided into representational and non-representational gestures. Most researchers regard language and gesture as "close relatives" that share a "family resemblance". Evidence from language development, cognitive psychology, and cognitive neuroscience indicates that gesture and language share a single communication system. When a gesture and a spoken word have the same meaning, the gesture is subject to interference from the word's articulation, and the second formant (F2) of the articulation is amplified. Gesture and speech follow an interaction account whose basis is semantic congruence, and the mirror neuron system mediates the transfer of this semantic congruence between the two. Because the relationship between co-speech gesture and language is a special case of the tight coupling of language and action, research on co-speech gesture will contribute to a deeper understanding of the human mind.

8.
This paper addresses the issue of the separability of disorders of sign language from disorders of gesture and pantomime. The study of a left-lesioned deaf signer presents one of the most striking examples to date of the cleavage between linguistic signs and manual pantomime. The left-hemisphere lesion produced a marked sign language aphasia disrupting both the production and the comprehension of sign language. However, in sharp contrast to the breakdown of sign language, the ability to communicate in nonlinguistic gesture was remarkably spared. This case has important implications for our understanding of the neural mediation of language and gesture. We argue that the differences observed in the fractionation of linguistic versus nonlinguistic gesture reflect differing degrees of compositionality of the systems underlying language and gesture. The compositionality hypothesis receives support from the existence of phonemic paraphasias in sign language production, illustrating structural dissolution that is absent in the production of pantomimic gesture. Understanding the neural encoding of compositional motoric systems may lead to a principled anatomical account of the neural separability of language and gesture. This case provides a powerful indication of the left hemisphere's specialization for language-specific functions.

9.
Purpose: The aim of this study was to examine the relationship between frequency of gesture use and language, with a consideration for the effect of age and setting on frequency of gesture use in prelinguistic typically developing children. Method: Participants included a total of 54 typically developing infants and toddlers between the ages of 9 months and 15 months, separated into two age ranges, 9-12 months and 12-15 months. All participants were administered the Mullen Scales of Early Learning, and two gesture samples were obtained: one sample in a structured setting and the other in an unstructured setting. Gesture samples were coded by research assistants blind to the purpose of the research study, and total frequency and frequencies for the following gesture types were calculated: behavior regulation, social interaction, and joint attention (Bruner, 1983). Results: Results indicated that both age and setting have a significant effect on frequency of gesture use and that frequency of gesture is correlated with receptive and expressive language abilities; however, these relationships are dependent upon the gesture type examined. Conclusions: These findings further our understanding of the relationship between gesture use and language and support the concept that frequency of gesture is related to language abilities. This is meaningful because gestures are one of the first forms of intentional communication, allowing for early identification of language abilities at a young age.

10.
Gesture is an important nonverbal medium in communicative interaction: it not only supports verbal communication but is also independently communicative, and, as a nonverbal medium that co-occurs with speech, gestural communication helps reduce the cognitive load of communication. This article reviews theories of communicative gesture based on the relationship between gesture and verbal expression, the activation theory of communicative gesture, and the cognitive-saving theory of communicative gesture. Future research should further consider the balance between ecological naturalness and experimental control in studies of communicative gesture, the relationship between communicative gesture and other nonverbal factors, and the practical significance of cognitive research on communicative gesture.

11.
In development, children often use gesture to communicate before they use words. The question is whether these gestures merely precede language development or are fundamentally tied to it. We examined 10 children making the transition from single words to two-word combinations and found that gesture had a tight relation to the children's lexical and syntactic development. First, a great many of the lexical items that each child produced initially in gesture later moved to that child's verbal lexicon. Second, children who were first to produce gesture-plus-word combinations conveying two elements in a proposition (point at bird and say "nap") were also first to produce two-word combinations ("bird nap"). Changes in gesture thus not only predate but also predict changes in language, suggesting that early gesture may be paving the way for future developments in language.

12.
People gesture a great deal when speaking, and research has shown that listeners can interpret the information contained in gesture. The current research examines whether learners can also use co‐speech gesture to inform language learning. Specifically, we examine whether listeners can use information contained in an iconic gesture to assign meaning to a novel verb form. Two experiments demonstrate that adults and 2‐, 3‐, and 4‐year‐old children can infer the meaning of novel intransitive verbs from gestures when no other source of information is present. The findings support the idea that gesture might be a source of input available to language learners.

13.
Talking and Thinking With Our Hands
When people talk, they gesture. Typically, gesture is produced along with speech and forms a fully integrated system with that speech. However, under unusual circumstances, gesture can be produced on its own, without speech. In these instances, gesture must take over the full burden of communication usually shared by the two modalities. What happens to gesture in this very different context? One possibility is that there are no differences in the forms gesture takes with speech and without it—that gesture is gesture no matter what its function. But that is not what we find. When gesture is produced on its own and assumes the full burden of communication, it takes on a language-like form. In contrast, when gesture is produced in conjunction with speech and shares the burden of communication with that speech, it takes on an unsegmented, imagistic form, often conveying information not found in speech. As such, gesture sheds light on how people think and can even play a role in changing those thoughts. Gesture can thus be part of language or it can itself be language, altering its form to fit its function.

14.
Form and function in early communication: language and pointing gestures
Pointing gestures of verbally advanced 2-year-olds were contrasted with those of less advanced peers, in order to examine the relationships of gesture to language during the acquisition of each. Hypotheses regarding the replacement of gestural functions by speech as verbal skills improve, regarding developmental correspondences between the two communicative domains, and regarding the independence of language acquisition from nonverbal developments were drawn from evolutionary, structuralist, and nativist viewpoints, respectively. Both formal and functional aspects of each communicative skill were measured, and were shown to be largely unrelated, particularly in the gestural domain. No evidence that language replaced gesture for communication in ontogeny was obtained. Correspondences between gesture and language occurred only between functional aspects of each, and the independence of developing language from gestural advances was suggested by the findings.

15.
Children achieve increasingly complex language milestones initially in gesture or in gesture+speech combinations before they do so in speech, from first words to first sentences. In this study, we ask whether gesture continues to be part of the language-learning process as children begin to develop more complex language skills, namely narratives. A key aspect of narrative development is tracking story referents, specifying who did what to whom. Adults track referents primarily in speech by introducing a story character with a noun phrase and then following the same referent with a pronoun—a strategy that presents challenges for young children. We ask whether young children can track story referents initially in communications that combine gesture and speech by using character viewpoint in gesture to introduce new story characters, before they are able to do so exclusively in speech using nouns followed by pronouns. Our analysis of 4- to 6-year-old children showed that children introduced new characters in gesture+speech combinations with character viewpoint gestures at an earlier age than conveying the same referents exclusively in speech with the use of nominal phrases followed by pronouns. Results show that children rely on viewpoint in gesture to convey who did what to whom as they take their first steps into narratives.

16.
Speakers convey meaning not only through words, but also through gestures. Although children are exposed to co-speech gestures from birth, we do not know how the developing brain comes to connect meaning conveyed in gesture with speech. We used functional magnetic resonance imaging (fMRI) to address this question and scanned 8- to 11-year-old children and adults listening to stories accompanied by hand movements, either meaningful co-speech gestures or meaningless self-adaptors. When listening to stories accompanied by both types of hand movement, both children and adults recruited inferior frontal, inferior parietal, and posterior temporal brain regions known to be involved in processing language not accompanied by hand movements. There were, however, age-related differences in activity in posterior superior temporal sulcus (STSp), inferior frontal gyrus, pars triangularis (IFGTr), and posterior middle temporal gyrus (MTGp) regions previously implicated in processing gesture. Both children and adults showed sensitivity to the meaning of hand movements in IFGTr and MTGp, but in different ways. Finally, we found that hand movement meaning modulates interactions between STSp and other posterior temporal and inferior parietal regions for adults, but not for children. These results shed light on the developing neural substrate for understanding meaning contributed by co-speech gesture.

17.
殷融. Advances in Psychological Science, 2020, 28(7): 1141-1155
Language evolution is an important topic in evolutionary psychology. The mirror system hypothesis, the tool-making hypothesis, and the teaching hypothesis explain the relationship between manual action and language evolution from different angles, and all three hold that human language originated in manual action experience. Related empirical studies have found that sign language and spoken language share consistent features, that language and manual action share a common neural basis, that gesture development predicts the level of language development, and that gesture improves the efficiency with which tool-making knowledge is transmitted; these findings provide empirical support for the specific claims of the three hypotheses. Future research in this area should attend to the developmental relationship between gestural/sign language and spoken language over the course of evolution, and to the relationship between the evolution of human language and the evolution of other cognitive traits.

18.
19.
Previous studies have shown that bilingual adults use more gestures than English monolinguals. Because no study has compared the gestures of bilinguals and monolinguals in both languages, the high gesture rate could be due to transfer from a high gesture language or could result from the use of gesture to aid in linguistic access. In this study we tried to distinguish between those causes by comparing the gesture rate of 10 French–English bilingual preschoolers with both 10 French and 10 English monolinguals. All were between 4 and 6 years of age. The children were asked to watch a cartoon and tell the story back. The results showed the bilingual children gestured more than either group of monolinguals and at the same rate in both French and English. These results suggest that the bilinguals were not gesturing because they were transferring the high gesture rate from one language to another. We argue that bilinguals might gesture more than monolinguals to help formulate their spoken message.

20.
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre‐specified items to a partner using repeated non‐linguistic vocalization, repeated gesture, or repeated non‐linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non‐linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non‐linguistic vocalization because it lends itself more naturally to the production of motivated signs.
