Similar documents
20 similar documents found (search time: 15 ms)
1.
Gesture and early bilingual development (cited 1 time: 0 self-citations, 1 by others)
The relationship between speech and gestural proficiency was investigated longitudinally (from 2 years to 3 years 6 months, at 6-month intervals) in 5 French-English bilingual boys with varying proficiency in their 2 languages. Because of their different levels of proficiency in the 2 languages at the same age, these children's data were used to examine the relative contribution of language and cognitive development to gestural development. In terms of rate of gesture production, rate of gesture production with speech, and meaning of gesture and speech, the children used gestures much like adults from 2 years on. In contrast, the use of iconic and beat gestures showed differential development in the children's 2 languages as a function of mean length of utterance. These data suggest that the development of these kinds of gestures may be more closely linked to language development than other kinds (such as points). Reasons why this might be so are discussed.

2.
The way adults express manner and path components of a motion event varies across typologically different languages both in speech and cospeech gestures, showing that language specificity in event encoding influences gesture. The authors tracked when and how this multimodal cross-linguistic variation develops in children learning Turkish and English, 2 typologically distinct languages. They found that children learn to speak in language-specific ways from age 3 onward (i.e., English speakers used 1 clause and Turkish speakers used 2 clauses to express manner and path). In contrast, English- and Turkish-speaking children's gestures looked similar at ages 3 and 5 (i.e., separate gestures for manner and path), differing from each other only at age 9 and in adulthood (i.e., English speakers used 1 gesture, but Turkish speakers used separate gestures for manner and path). The authors argue that this pattern of the development of cospeech gestures reflects a gradual shift to language-specific representations during speaking and shows that looking at speech alone may not be sufficient to understand the full process of language acquisition.

3.
We studied how gesture use changes with culture, age and increased spoken language competence. A picture-naming task was presented to British (N = 80) and Finnish (N = 41) typically developing children aged 2–5 years. British children were found to gesture more than Finnish children and, in both cultures, gesture production decreased after the age of two. Compared with older children, two-year-olds used relatively more deictic than iconic gestures, and gestured more before the onset of speech rather than simultaneously with or after speech. The British 3- and 5-year-olds gestured significantly more when naming praxic (manipulable) items than non-praxic items. Our results support the view that gesture serves both a communicative and an intrapersonal function, and that the relative weight of each function may change with age. Speech and language therapists and psychologists observe the development of children's gestures and make predictions on the basis of their frequency and type. To avoid drawing erroneous conclusions about children's linguistic development, it is important to understand developmental and cultural variations in gesture use.

4.
In development, children often use gesture to communicate before they use words. The question is whether these gestures merely precede language development or are fundamentally tied to it. We examined 10 children making the transition from single words to two-word combinations and found that gesture had a tight relation to the children's lexical and syntactic development. First, a great many of the lexical items that each child produced initially in gesture later moved to that child's verbal lexicon. Second, children who were first to produce gesture-plus-word combinations conveying two elements in a proposition (point at bird and say "nap") were also first to produce two-word combinations ("bird nap"). Changes in gesture thus not only predate but also predict changes in language, suggesting that early gesture may be paving the way for future developments in language.

5.
Previous studies have shown that bilingual adults use more gestures than English monolinguals. Because no study has compared the gestures of bilinguals and monolinguals in both languages, the high gesture rate could be due to transfer from a high-gesture language or could result from the use of gesture to aid linguistic access. In this study we tried to distinguish between these causes by comparing the gesture rate of 10 French–English bilingual preschoolers with that of 10 French and 10 English monolinguals. All were between 4 and 6 years of age. The children were asked to watch a cartoon and tell the story back. The results showed that the bilingual children gestured more than either group of monolinguals and at the same rate in both French and English. These results suggest that the bilinguals were not gesturing because they were transferring a high gesture rate from one language to another. We argue that bilinguals might gesture more than monolinguals to help formulate their spoken message.

6.
Sighted speakers of different languages vary systematically in how they package and order components of a motion event in speech. These differences influence how semantic elements are organized in gesture, but only when those gestures are produced with speech (co‐speech gesture), not without speech (silent gesture). We ask whether the cross‐linguistic similarity in silent gesture is driven by the visuospatial structure of the event. We compared 40 congenitally blind adult native speakers of English or Turkish (20/language) to 80 sighted adult speakers (40/language; half with, half without blindfolds) as they described three‐dimensional motion scenes. We found an effect of language on co‐speech gesture, not on silent gesture—blind speakers of both languages organized their silent gestures as sighted speakers do. Humans may have a natural semantic organization that they impose on events when conveying them in gesture without language—an organization that relies on neither visuospatial cues nor language structure.

7.
Gesture and language are tightly connected during the development of a child's communication skills. Gestures mostly precede and shape the course of language development, although influence in the opposite direction has also been found. A few recent studies have focused on the relationship between specific gestures and specific word categories, emphasising that the onset of one gesture type predicts the onset of certain word categories or of the earliest word combinations. The aim of this study was to analyse the predictive role of different gesture types in the onset of the first word categories in a child's early expressive vocabulary. Our data show that different types of gestures predict different types of word production. Object gestures predict open-class words from the age of 13 months, and gestural routines predict closed-class words and social terms from 8 months. Receptive vocabulary has a strong mediating role for all linguistically defined categories (open- and closed-class words) but not for social terms, which are the largest word category in a child's early expressive vocabulary. Accordingly, the main contribution of this study is to define the impact of different gesture types on early expressive vocabulary and to determine the role of receptive vocabulary in the gesture–expressive vocabulary relation in Croatian.

8.
The role of gesture in children's learning to count (cited 6 times: 0 self-citations, 6 by others)
The role of spontaneous gesture was examined in children's counting and in their assessment of counting accuracy. Eighty-five 2-, 3-, and 4-year-olds counted 6 sets of 2-, 4-, and 6-object arrays. In addition, children assessed the counting accuracy of a puppet whose gestures varied as he counted (i.e., gesture matched the number words, gesture mismatched the number words, no gesture at all). Results showed that the correspondence of children's speech and gesture varied systematically across the age groups and that children adhered to the one-to-one correspondence principle in gesture prior to speech. Moreover, children's correspondence of speech and gesture, adherence to the one-to-one principle in gesture, and assessment of the puppet's counting accuracy were related to children's counting accuracy. Findings are discussed in terms of the role that gesture may play in children's understanding of counting.

9.
More gestures than answers: children learning about balance (cited 1 time: 0 self-citations, 1 by others)

10.
When children learn language, they apply their language-learning skills to the linguistic input they receive. But what happens if children are not exposed to input from a conventional language? Do they engage their language-learning skills nonetheless, applying them to whatever unconventional input they have? We address this question by examining gesture systems created by four American and four Chinese deaf children. The children's profound hearing losses prevented them from learning spoken language, and their hearing parents had not exposed them to sign language. Nevertheless, the children in both cultures invented gesture systems that were structured at the morphological/word level. Interestingly, the differences between the children's systems were no bigger across cultures than within cultures. The children's morphemes could not be traced to their hearing mothers' gestures; however, they were built out of forms and meanings shared with their mothers. The findings suggest that children construct morphological structure out of the input that is handed to them, even if that input is not linguistic in form.

11.
Sign languages modulate the production of signs in space and use this spatial modulation to refer back to entities—to maintain coreference. We ask here whether spatial modulation is so fundamental to language in the manual modality that it will be invented by individuals asked to create gestures on the spot. English speakers were asked to describe vignettes under 2 conditions: using gesture without speech, and using speech with spontaneous gestures. When using gesture alone, adults placed gestures for particular entities in non-neutral locations and then used those locations to refer back to the entities. When using gesture plus speech, adults also produced gestures in non-neutral locations but used the locations coreferentially far less often. When gesture is forced to take on the full burden of communication, it exploits space for coreference. Coreference thus appears to be a resilient property of language, likely to emerge in communication systems no matter how simple.

12.
The gestures children produce predict the early stages of spoken language development. Here we ask whether gesture is a global predictor of language learning, or whether particular gestures predict particular language outcomes. We observed 52 children interacting with their caregivers at home, and found that gesture use at 18 months selectively predicted lexical versus syntactic skills at 42 months, even with early child speech controlled. Specifically, number of different meanings conveyed in gesture at 18 months predicted vocabulary at 42 months, but number of gesture+speech combinations did not. In contrast, number of gesture+speech combinations, particularly those conveying sentence‐like ideas, produced at 18 months predicted sentence complexity at 42 months, but meanings conveyed in gesture did not. We can thus predict particular milestones in vocabulary and sentence complexity by watching how children move their hands two years earlier.

13.
Previous research has shown differences in monolingual and bilingual communication. We explored whether monolingual and bilingual pre‐schoolers (N = 80) differ in their ability to understand others' iconic gestures (gesture perception) and produce intelligible iconic gestures themselves (gesture production) and how these two abilities are related to differences in parental iconic gesture frequency. In a gesture perception task, the experimenter replaced the last word of every sentence with an iconic gesture. The child was then asked to choose one of four pictures that matched the gesture as well as the sentence. In a gesture production task, children were asked to indicate ‘with their hands’ to a deaf puppet which objects to select. Finally, parental gesture frequency was measured while parents answered three different questions. In the iconic gesture perception task, monolingual and bilingual children did not differ. In contrast, bilinguals produced more intelligible gestures than their monolingual peers. Finally, bilingual children's parents gestured more while they spoke than monolingual children's parents. We suggest that bilinguals' heightened sensitivity to their interaction partner supports their ability to produce intelligible gestures and results in a bilingual advantage in iconic gesture production.

14.
Purpose: The aim of this study was to examine the relationship between frequency of gesture use and language, with consideration of the effect of age and setting on frequency of gesture use in prelinguistic typically developing children. Method: Participants included 54 typically developing infants and toddlers between 9 and 15 months of age, separated into two age ranges: 9–12 months and 12–15 months. All participants were administered the Mullen Scales of Early Learning, and two gesture samples were obtained: one in a structured setting and the other in an unstructured setting. Gesture samples were coded by research assistants blind to the purpose of the study, and total frequency and frequencies for the following gesture types were calculated: behavior regulation, social interaction, and joint attention (Bruner, 1983). Results: Both age and setting had a significant effect on frequency of gesture use, and frequency of gesture was correlated with receptive and expressive language abilities; however, these relationships depended on the gesture type examined. Conclusions: These findings further our understanding of the relationship between gesture use and language and support the view that frequency of gesture is related to language abilities. This is meaningful because gestures are one of the first forms of intentional communication, allowing for early identification of language abilities at a young age.

15.
16.
Typically developing (TD) children refer to objects uniquely in gesture (e.g., point at a cat) before they produce verbal labels for these objects (“cat”). The onset of such gestures predicts the onset of similar spoken words, showing a strong positive relation between early gestures and early words. We asked whether gesture plays the same door-opening role in word learning for children with autism spectrum disorder (ASD) and Down syndrome (DS), who show delayed vocabulary development and who differ in the strength of gesture production. To answer this question, we observed 23 18-month-old TD children, 23 30-month-old children with ASD, and 23 30-month-old children with DS 5 times over a year during parent–child interactions. Children in all 3 groups initially expressed a greater proportion of referents uniquely in gesture than in speech. Many of these unique gestures subsequently entered children’s spoken vocabularies within a year—a pattern that was slightly less robust for children with DS, whose word production was the most markedly delayed. These results indicate that gesture is as fundamental to vocabulary development for children with developmental disorders as it is for TD children.

17.
Understanding the context for children's social learning and language acquisition requires consideration of caregivers’ multi-modal (speech, gesture) messages. Though young children can interpret both manual and head gestures, little research has examined the communicative input that children receive via parents’ head gestures. We longitudinally examined the frequency and communicative functions of mothers’ head nodding and head shaking gestures during laboratory play sessions for 32 mother–child dyads, when the children were 14, 20, and 30 months of age. The majority of mothers produced head nods more frequently than head shakes. Both gestures contributed to mothers’ verbal attempts at behavior regulation and dialog. Mothers’ head nods primarily conveyed agreement with, and attentiveness to, children's utterances, and accompanied affirmative statements and yes/no questions. Mothers’ head shakes primarily conveyed prohibitions and statements with negations. Changes over time appeared to reflect corresponding developmental changes in social and communicative dimensions of caregiver–child interaction. Directions for future research are discussed regarding the role of head gesture input in socialization and in supporting language development.

18.
Two experiments investigated the relative influence of speech and pointing gesture information in the interpretation of referential acts. Children averaging 3 and 5 years of age and adults viewed a videotape containing the independent manipulation of speech and gestural forms of reference. A man instructed the subjects to choose a ball or a doll by vocally labeling the referent and/or pointing to it. A synthetic speech continuum between two alternatives was crossed with the pointing gesture in a factorial design. Based on research in other domains, it was predicted that all age groups would utilize gestural information, although both speech and gestures were predicted to influence children less than adults. The main effects and interactions of speech and gesture in combination with quantitative models of performance showed the following similarities in information processing between preschoolers and adults: (1) referential evaluation of gestures occurs independently of the evaluation of linguistic reference; (2) speech and gesture are continuous, rather than discrete, sources of information; (3) 5-year-olds and adults combine the two types of information in such a way that the least ambiguous source has the most impact on the judgment. Greater discriminability of both speech and gesture information for adults compared to preschoolers indicated small quantitative progressions with development in the ability to extract and utilize referential signals.

19.
Both vocalization and gesture are universal modes of communication and fundamental features of language development. The gestural origins theory proposes that language evolved out of early gestural use. However, evidence reported here suggests vocalization is much more prominent in early human communication than gesture is. To our knowledge no prior research has investigated the rates of emergence of both gesture and vocalization across the first year in human infants. We evaluated the rates of gestures and speech-like vocalizations (protophones) in 10 infants at 4, 7, and 11 months of age using parent-infant laboratory recordings. We found that infant protophones outnumbered gestures substantially at all three ages, ranging from >35 times more protophones than gestures at 4 months, to >2.5 times more protophones than gestures at 11 months. The results suggest vocalization, not gesture, is the predominant mode of communication in human infants in the first year.

20.
This study looks at whether there is a relationship between mother and infant gesture production. Specifically, it addresses the extent of articulation in the maternal gesture repertoire and how closely it supports the infant production of gestures. Eight Spanish mothers and their 1‐ and 2‐year‐old babies were studied during 1 year of observations. Maternal and child verbal production, gestures and actions were recorded at their homes on five occasions while performing daily routines. Results indicated that mother and child deictic gestures (pointing and instrumental) and representational gestures (symbolic and social) were very similar at each age group and did not decline across groups. Overall, deictic gestures were more frequent than representational gestures. Maternal adaptation to developmental changes is specific for gesturing but not for acting. Maternal and child speech were related positively to mother and child pointing and representational gestures, and negatively to mother and child instrumental gestures. Mother and child instrumental gestures were positively related to action production, after maternal and child speech was partialled out. Thus, language plays an important role for dyadic communicative activities (gesture–gesture relations) but not for dyadic motor activities (gesture–action relations). Finally, a comparison of the growth curves across sessions showed a closer correspondence for mother–child deictic gestures than for representational gestures. Overall, the results point to the existence of an articulated maternal gesture input that closely supports the child gesture production. Copyright © 2006 John Wiley & Sons, Ltd.
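The "partialled out" analysis in the abstract above refers to partial correlation: the covariate (here, speech) is regressed out of both variables, and the residuals are then correlated. A minimal sketch of that computation, using entirely synthetic, illustrative data (the variable names `speech`, `gesture`, and `action` are placeholders, not the study's measures):

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after regressing covariate z out of both."""
    design = np.column_stack([np.ones_like(z), z])  # intercept + covariate
    rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]  # residuals of x
    ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]  # residuals of y
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic data: gesture partly driven by speech, action partly by gesture.
rng = np.random.default_rng(0)
speech = rng.normal(size=200)
gesture = speech + rng.normal(size=200)
action = gesture + rng.normal(size=200)

raw = np.corrcoef(gesture, action)[0, 1]           # inflated by shared speech component
partial = partial_corr(gesture, action, speech)    # gesture-action link with speech removed
print(raw, partial)
```

Because part of the raw gesture-action correlation is carried by the shared speech component, the partial correlation comes out smaller than the raw one while remaining positive, which is the pattern the abstract reports for instrumental gestures and actions.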


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号