Related articles
20 related articles found
1.
Typically developing (TD) children refer to objects uniquely in gesture (e.g., point at a cat) before they produce verbal labels for these objects (“cat”). The onset of such gestures predicts the onset of similar spoken words, showing a strong positive relation between early gestures and early words. We asked whether gesture plays the same door-opening role in word learning for children with autism spectrum disorder (ASD) and Down syndrome (DS), who show delayed vocabulary development and who differ in the strength of gesture production. To answer this question, we observed 23 18-month-old TD children, 23 30-month-old children with ASD, and 23 30-month-old children with DS 5 times over a year during parent–child interactions. Children in all 3 groups initially expressed a greater proportion of referents uniquely in gesture than in speech. Many of these unique gestures subsequently entered children’s spoken vocabularies within a year—a pattern that was slightly less robust for children with DS, whose word production was the most markedly delayed. These results indicate that gesture is as fundamental to vocabulary development for children with developmental disorders as it is for TD children.

2.
We studied how gesture use changes with culture, age and increased spoken language competence. A picture-naming task was presented to British (N = 80) and Finnish (N = 41) typically developing children aged 2–5 years. British children were found to gesture more than Finnish children and, in both cultures, gesture production decreased after the age of two. Two-year-olds used proportionally more deictic than iconic gestures compared with older children, and gestured more before the onset of speech, rather than simultaneously with or after speech. The British 3- and 5-year-olds gestured significantly more when naming praxic (manipulable) items than non-praxic items. Our results support the view that gesture serves a communicative and intrapersonal function, and that the relative function may change with age. Speech and language therapists and psychologists observe the development of children’s gestures and make predictions on the basis of their frequency and type. To prevent drawing erroneous conclusions about children’s linguistic development, it is important to understand developmental and cultural variations in gesture use.

3.
Previous research has found that iconic gestures (i.e., gestures that depict the actions, motions or shapes of entities) identify referents that are also lexically specified in the co-occurring speech produced by proficient speakers. This study examines whether concrete deictic gestures (i.e., gestures that point to physical entities) bear a different kind of relation to speech, and whether this relation is influenced by the language proficiency of the speakers. Two groups of speakers who had different levels of English proficiency were asked to retell a story in English. Their speech and gestures were transcribed and coded. Our findings showed that proficient speakers produced concrete deictic gestures for referents that were not specified in speech, and iconic gestures for referents that were specified in speech, suggesting that these two types of gestures bear different kinds of semantic relations with speech. In contrast, less proficient speakers produced concrete deictic gestures and iconic gestures whether or not referents were lexically specified in speech. Thus, both the type of gesture and the proficiency of the speaker need to be considered when accounting for how gesture and speech are used in a narrative context.

4.
Gesture and early bilingual development
The relationship between speech and gestural proficiency was investigated longitudinally (from 2 years to 3 years 6 months, at 6-month intervals) in 5 French-English bilingual boys with varying proficiency in their 2 languages. Because of their different levels of proficiency in the 2 languages at the same age, these children's data were used to examine the relative contribution of language and cognitive development to gestural development. In terms of rate of gesture production, rate of gesture production with speech, and meaning of gesture and speech, the children used gestures much like adults from 2 years on. In contrast, the use of iconic and beat gestures showed differential development in the children's 2 languages as a function of mean length of utterance. These data suggest that the development of these kinds of gestures may be more closely linked to language development than other kinds (such as points). Reasons why this might be so are discussed.

5.
Background: Awareness of stuttering is likely to depend upon the development of the metalinguistic skill to discriminate between fluent speech and stuttering and the ability to identify one’s own speech as fluent or stuttered. Presently, little is known about these abilities in individuals with Down syndrome (DS).
Purpose: This study investigates whether individuals with DS and typically developing (TD) children who stutter and who do not stutter differ in their ability to discriminate between fluent speech and stuttering. The second purpose of this study is to discover if this ability is correlated with their self-identification ability.
Method: An experiment to investigate awareness with tasks for discrimination of stuttering and self-identification was developed. It was administered to 28 individuals (7–19 years) with DS, 17 of whom stutter and 11 of whom do not, and 20 TD children (3–10 years), 8 of whom stutter and 12 of whom do not. Skills to discriminate stuttering were compared between these groups and correlated with self-identification within these groups. The influence of stuttering severity and developmental/chronological age on the ability to discriminate was also investigated.
Results: The ability to discriminate does not differ significantly between the DS and TD groups, but is strongly influenced by developmental age. This ability correlates with self-identification, but only for the TD individuals who speak fluently.
Conclusion: The ability to discriminate matures around the age of 7, and conscious awareness may rely on this ability. Differences between the present findings and earlier studies suggest that differentiation in levels and types of awareness is warranted.

6.
The design of effective communications depends upon an adequate model of the communication process. The traditional model is that speech conveys semantic information and bodily movement conveys information about emotion and interpersonal attitudes. But McNeill (2000) argues that this model is fundamentally wrong and that some bodily movements, namely spontaneous hand movements generated during talk (iconic gestures), are integral to semantic communication. But can we increase the effectiveness of communication using this new theory? Focusing on advertising, we found that advertisements in which the message was split between speech and iconic gesture (possible on TV) were significantly more effective than advertisements in which meaning resided purely in speech or language (radio/newspaper). We also found that the significant differences in communicative effectiveness were maintained across five consecutive trials. We compared the communicative power of professionally made TV advertisements in which a spoken message was accompanied either by iconic gestures or by pictorial images, and found the iconic gestures to be more effective. We hypothesized that iconic gestures are so effective because they illustrate and isolate just the core semantic properties of a product. This research suggests that TV advertisements can be made more effective by incorporating iconic gestures with exactly the right temporal and semantic properties.

7.
王辉  李广政 《心理科学进展》2021,29(9):1617-1627
Gestures are hand movements produced during communication or cognition that do not act directly on objects; they can be concrete or abstract. They are classified mainly by their source, their content, their intent, and how they match the accompanying speech. Different types of gestures differ in when they emerge and in their developmental trajectories. Gestures facilitate children's vocabulary learning, verbal expression, mathematical problem solving, spatial learning, and memory, but no consistent conclusion has been reached about their influence on speech comprehension. Future research could examine the relation between different types of gestures and children's cognitive development, and compare the advantages of gestures from different sources across learning domains.

8.
Children can understand iconic co-speech gestures that characterize entities by age 3 (Stanfield et al. in J Child Lang 40(2):1–10, 2014; e.g., “I’m drinking” + tilting hand in C-shape to mouth as if holding a glass). In this study, we ask whether children understand co-speech gestures that characterize events as early as they do so for entities, and if so, whether their understanding is influenced by the patterns of gesture production in their native language. We examined this question by studying native English-speaking 3- to 4-year-old children and adults as they completed an iconic co-speech gesture comprehension task involving motion events across two studies. Our results showed that children understood iconic co-speech gestures about events at age 4, marking comprehension of gestures about events one year later than gestures about entities. Our findings also showed that native gesture production patterns influenced children’s comprehension of gestures characterizing such events, with better comprehension for gestures that follow language-specific patterns compared to the ones that do not follow such patterns—particularly for manner of motion. Overall, these results highlight early emerging abilities in gesture comprehension about motion events.

9.
More gestures than answers: children learning about balance

10.
The present study investigates hand choice in iconic gestures that accompany speech. In 10 right-handed subjects, gestures were elicited by verbal narration and by silent gestural demonstrations of animations with two moving objects. In both conditions, the left hand was used as often as the right hand to display iconic gestures. The choice of the right or left hand was determined by semantic aspects of the message. The influence of hemispheric language lateralization on hand choice in co-speech gestures appeared to be minor. Instead, speaking seemed to induce a sequential organization of the iconic gestures.

11.
The effects of prohibiting gestures on children's lexical retrieval ability
Two alternative accounts have been proposed to explain the role of gestures in thinking and speaking. The Information Packaging Hypothesis (Kita, 2000) claims that gestures are important for the conceptual packaging of information before it is coded into a linguistic form for speech. The Lexical Retrieval Hypothesis (Rauscher, Krauss & Chen, 1996) sees gestures as functioning more at the level of speech production in helping the speaker to find the right words. The latter hypothesis has not been fully explored with children. In this study children were given a naming task under conditions that allowed and restricted gestures. Children named more words correctly and resolved more 'tip-of-the-tongue' states when allowed to gesture than when not, suggesting that gestures facilitate access to the lexicon in children and are important for speech production as well as conceptualization.

12.
This study explores a common assumption made in the cognitive development literature that children will treat gestures as labels for objects. Without doubt, researchers in these experiments intend to use gestures symbolically as labels. The present studies examine whether children interpret these gestures as labels. In Study 1, two-, three-, and four-year-olds tested in a training paradigm learned gesture–object pairs for both iconic and arbitrary gestures. Iconic gestures became more accurate with age, while arbitrary gestures did not. Study 2 tested the willingness of children aged 40–60 months to fast map novel nouns, iconic gestures and arbitrary gestures to novel objects. Children used fast mapping to choose objects for novel nouns, but treated gesture as an action associate, looking for an object that could perform the action depicted by the gesture. They were successful with iconic gestures but chose objects randomly for arbitrary gestures and did not fast map. Study 3 tested whether this effect was a result of the framing of the request and found that results did not change regardless of whether the request was framed with a deictic phrase (“this one 〈gesture〉”) or an article (“a 〈gesture〉”). Implications for preschool children’s understanding of iconicity, and for their default interpretations of gesture, are discussed.

13.
After years of walking practice, 8- to 10-year-old children with typical development (TD) and those with Down syndrome (DS) show uniquely different but efficient use of dynamic resources to walk overground and on a treadmill [Ulrich, B.D., Haehl, V., Buzzi, U., Kubo, M., & Holt, K.G. (2004). Modeling dynamic resource utilization in populations with unique constraints: Preadolescents with and without Down syndrome. Human Movement Science, 23, 133-156]. Here we examined the use of global stiffness and angular impulse when walking emerged and across the ensuing months of practice in eight toddlers with TD and eight with DS. Participants visited our lab when first able to walk four to six steps, and at one, three, four, and six months of walking experience. For all visits, toddlers walked overground at their preferred speeds and, for the last two visits, on a treadmill. Toddlers with TD and DS demonstrated clear and similar developmental trajectories over this period, with more similarities than differences between groups. At six months, stiffness and impulse values were higher than previously observed for 8- to 10-year-old children. Stiffness values increased significantly throughout this period, though the rate of change slowed for the TD group by three months of experience. Impulse values rose sharply initially and slowed to a plateau during the latter months. Treadmill data illustrated toddlers' capacity to adapt dynamic resource use to imposed changes in speed, particularly well after six months of practice. Consistent with our studies of preadolescents and older adults, toddlers with DS produced significantly wider normalized step width than their TD peers. We propose that the challenge of upright bipedal locomotion constrains toddlers with TD and DS to generate similar, necessary and sufficient stiffness and impulse values to walk as they gain control and adapt to playful and self-imposed perturbations of gait over the first six months. The plateau in impulse and the slowing of stiffness increases over the latter months may be the first signs of a downward trend toward the lower values produced by older children with several years of walking experience.

14.
15.
Previous research has shown differences in monolingual and bilingual communication. We explored whether monolingual and bilingual pre-schoolers (N = 80) differ in their ability to understand others' iconic gestures (gesture perception) and produce intelligible iconic gestures themselves (gesture production) and how these two abilities are related to differences in parental iconic gesture frequency. In a gesture perception task, the experimenter replaced the last word of every sentence with an iconic gesture. The child was then asked to choose one of four pictures that matched the gesture as well as the sentence. In a gesture production task, children were asked to indicate ‘with their hands’ to a deaf puppet which objects to select. Finally, parental gesture frequency was measured while parents answered three different questions. In the iconic gesture perception task, monolingual and bilingual children did not differ. In contrast, bilinguals produced more intelligible gestures than their monolingual peers. Finally, bilingual children's parents gestured more while they spoke than monolingual children's parents. We suggest that bilinguals' heightened sensitivity to their interaction partner supports their ability to produce intelligible gestures and results in a bilingual advantage in iconic gesture production.

16.
People with aphasia use gestures not only to communicate relevant content but also to compensate for their verbal limitations. The Sketch Model (De Ruiter, 2000) assumes a flexible relationship between gesture and speech with the possibility of a compensatory use of the two modalities. In the successor of the Sketch Model, the AR-Sketch Model (De Ruiter, 2017), the relationship between iconic gestures and speech is no longer assumed to be flexible and compensatory, but instead iconic gestures are assumed to express information that is redundant to speech. In this study, we evaluated the contradictory predictions of the Sketch Model and the AR-Sketch Model using data collected from people with aphasia as well as a group of people without language impairment. We only found compensatory use of gesture in the people with aphasia, whereas the people without language impairments made very little compensatory use of gestures. Hence, the people with aphasia gestured according to the prediction of the Sketch Model, whereas the people without language impairment did not. We conclude that aphasia fundamentally changes the relationship of gesture and speech.

17.
Gesture Reflects Language Development: Evidence From Bilingual Children
There is a growing awareness that language and gesture are deeply intertwined in the spontaneous expression of adults. Although some research suggests that children use gesture independently of speech, there is scant research on how language and gesture develop in children older than 2 years. We report here on a longitudinal investigation of the relation between gesture and language development in French-English bilingual children from 2 to 3 1/2 years old. The specific gesture types of iconics and beats correlated with the development of the children's two languages, whereas pointing types of gestures generally did not. The onset of iconic and beat gestures coincided with the onset of sentencelike utterances separately in each of the children's two languages. The findings show that gesture is related to language development rather than being independent from it. Contrasting theories about how gesture is related to language development are discussed.

18.
Semantically rich learning contexts facilitate semantic, phonological, and articulatory aspects of word learning in children with typical development (TD). However, because children with autism spectrum disorder (ASD) show differences at each of these processing levels, it is unclear whether they will benefit from semantic cues in the same manner as their typical peers. The goal of this study was to track how the inclusion of rich, sparse, or no semantic cues influences semantic, phonological, and articulatory aspects of word learning in children with ASD and TD over time. Twenty-four school-aged children (12 in each group), matched on expressive vocabulary, participated in an extended word learning paradigm. Performance on five measures of learning (referent identification, confrontation naming, defining, phonetic accuracy, and speech motor stability) was tracked across three sessions approximately one week apart to assess the influence of semantic richness on extended learning. Results indicate that children with ASD benefit from semantically rich learning contexts similarly to their peers with TD; however, one key difference between the two groups emerged: the children with ASD showed heightened shifts in speech motor stability. These findings offer insights into common learning mechanisms in children with ASD and TD, as well as pointing to a potentially distinct speech motor learning trajectory in children with ASD, providing a window into the emergence of stereotypic vocalizations in these children.

19.
The recognition of iconic correspondence between signal and referent has been argued to bootstrap the acquisition and emergence of language. Here, we study the ontogeny, and to some extent the phylogeny, of the ability to spontaneously relate iconic signals, gestures, and/or vocalizations, to previous experience. Children at 18, 24, and 36 months of age (N = 216) and great apes (N = 13) interacted with two apparatuses, each comprising a distinct action and sound. Subsequently, an experimenter mimicked either the action, the sound, or both in combination to refer to one of the apparatuses. Experiments 1 and 2 found no spontaneous comprehension in great apes and in 18-month-old children. At 24 months of age, children were successful with a composite vocalization-gesture signal but not with either vocalization or gesture alone. At 36 months, children succeeded both with a composite vocalization-gesture signal and with gesture alone, but not with vocalization alone. In general, gestures were understood better compared to vocalizations. Experiment 4 showed that gestures were understood irrespective of how children learned about the corresponding action (through observation or self-experience). This pattern of results demonstrates that iconic signals can be a powerful way to establish reference in the absence of language, but they are not trivial for children to comprehend and not all iconic signals are created equal.

20.