Similar Articles
20 similar articles found.
1.
Previous research has shown differences in monolingual and bilingual communication. We explored whether monolingual and bilingual pre-schoolers (N = 80) differ in their ability to understand others' iconic gestures (gesture perception) and produce intelligible iconic gestures themselves (gesture production), and how these two abilities are related to differences in parental iconic gesture frequency. In a gesture perception task, the experimenter replaced the last word of every sentence with an iconic gesture. The child was then asked to choose one of four pictures that matched the gesture as well as the sentence. In a gesture production task, children were asked to indicate ‘with their hands’ to a deaf puppet which objects to select. Finally, parental gesture frequency was measured while parents answered three different questions. In the iconic gesture perception task, monolingual and bilingual children did not differ. In contrast, bilinguals produced more intelligible gestures than their monolingual peers. Finally, bilingual children's parents gestured more while they spoke than monolingual children's parents. We suggest that bilinguals' heightened sensitivity to their interaction partner supports their ability to produce intelligible gestures and results in a bilingual advantage in iconic gesture production.

2.
Both vocalization and gesture are universal modes of communication and fundamental features of language development. The gestural origins theory proposes that language evolved out of early gestural use. However, evidence reported here suggests vocalization is much more prominent than gesture in early human communication. To our knowledge, no prior research has investigated the rates of emergence of both gesture and vocalization across the first year in human infants. We evaluated the rates of gestures and speech-like vocalizations (protophones) in 10 infants at 4, 7, and 11 months of age using parent-infant laboratory recordings. We found that infant protophones substantially outnumbered gestures at all three ages, ranging from >35 times more protophones than gestures at 4 months to >2.5 times more protophones than gestures at 11 months. The results suggest vocalization, not gesture, is the predominant mode of communication in human infants in the first year.

3.
This study explores a common assumption made in the cognitive development literature: that children will treat gestures as labels for objects. Without doubt, researchers in these experiments intend the gestures to function symbolically as labels; the present studies examine whether children interpret them that way. In Study 1, two-, three-, and four-year-olds tested in a training paradigm learned gesture–object pairs for both iconic and arbitrary gestures. Accuracy with iconic gestures improved with age, while accuracy with arbitrary gestures did not. Study 2 tested the willingness of children aged 40–60 months to fast map novel nouns, iconic gestures, and arbitrary gestures to novel objects. Children used fast mapping to choose objects for novel nouns, but treated gesture as an action associate, looking for an object that could perform the action depicted by the gesture. They were successful with iconic gestures but chose objects randomly for arbitrary gestures and did not fast map. Study 3 tested whether this effect was a result of the framing of the request and found that results did not change regardless of whether the request was framed with a deictic phrase (“this one 〈gesture〉”) or an article (“a 〈gesture〉”). Implications for preschool children's understanding of iconicity, and for their default interpretations of gesture, are discussed.

4.
What aspects of infants' prelinguistic communication are most valuable for learning to speak, and why? We test whether early vocalizations and gestures drive the transition to word use because, in addition to indicating motoric readiness, they (a) are early instances of intentional communication and (b) elicit verbal responses from caregivers. In study 1, 11-month-olds (N = 134) were observed to coordinate vocalizations and gestures with gaze to their caregiver's face at above-chance rates, indicating that they are plausibly intentionally communicative. Study 2 tested whether those infant communicative acts that were gaze-coordinated best predicted later expressive vocabulary. We report a novel procedure for predicting vocabulary via multi-model inference over a comprehensive set of infant behaviours produced at 11 and 12 months (n = 58); this makes it possible to establish the relative predictive value of different behaviours that are hierarchically organized by level of granularity. Gaze-coordinated vocalizations were the most valuable predictors of expressive vocabulary size up to 24 months. Study 3 established that caregivers were more likely to respond to gaze-coordinated behaviours. Moreover, the dyadic combination of infant gaze-coordinated vocalization and caregiver response was by far the best predictor of later vocabulary size. We conclude that practice with prelinguistic intentional communication facilitates the leap to symbol use. Learning is optimized when caregivers respond to intentional vocalizations with appropriate language.
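The abstract does not spell out the multi-model inference procedure, so below is a minimal sketch of one standard way such an analysis is done: AIC-based model averaging over all subsets of candidate predictors, with each predictor's importance taken as its summed Akaike weight. The behaviour and outcome names are hypothetical, not the authors' coding scheme.

```python
# Sketch of AIC-based multi-model inference: fit a regression for every subset
# of candidate predictors, convert AICs to Akaike weights, and sum the weights
# over the models containing each predictor to get its relative importance.
from itertools import combinations
import numpy as np
import statsmodels.api as sm

def predictor_importance(df, outcome, predictors):
    # df: pandas DataFrame with one row per infant.
    models = []
    for k in range(1, len(predictors) + 1):
        for subset in combinations(predictors, k):
            X = sm.add_constant(df[list(subset)])
            models.append((subset, sm.OLS(df[outcome], X).fit().aic))
    aics = np.array([aic for _, aic in models])
    weights = np.exp(-0.5 * (aics - aics.min()))
    weights /= weights.sum()  # Akaike weights sum to 1 across the model set
    return {p: sum(w for (s, _), w in zip(models, weights) if p in s)
            for p in predictors}

# Hypothetical usage, with behaviours coded at 11-12 months:
# predictor_importance(df, "vocab_24m",
#     ["gaze_coord_vocalization", "vocalization", "gaze_coord_gesture", "gesture"])
```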

5.
We report on a study investigating 3–5-year-old children's use of gesture to resolve lexical ambiguity. Children were told three short stories that contained two homonym senses; for example, bat (flying mammal) and bat (sports equipment). They were then asked to re-tell these stories to a second experimenter. The data were coded for the means that children used during attempts at disambiguation: speech, gesture, or a combination of the two. The results indicated that the 3-year-old children rarely disambiguated the two senses, mainly using deictic pointing gestures during attempts at disambiguation. In contrast, the 4-year-old children attempted to disambiguate the two senses more often, using a larger proportion of iconic gestures than the other children. The 5-year-old children used fewer iconic gestures than the 4-year-olds but, unlike the 3-year-olds, were able to disambiguate the senses through the verbal channel. The results highlight the value of gesture to the development of children's language and communication skills.

6.
When asked to explain their solutions to a problem, children often gesture and, at times, these gestures convey information that is different from the information conveyed in speech. Children who produce these gesture-speech “mismatches” on a particular task have been found to profit from instruction on that task. We have recently found that some children produce gesture-speech mismatches when identifying numbers at the cusp of their knowledge; for example, a child incorrectly labels a set of two objects with the word “three” and simultaneously holds up two fingers. These mismatches differ from previously studied mismatches (where the information conveyed in gesture has the potential to be integrated with the information conveyed in speech) in that the gestured response contradicts the spoken response. Here, we ask whether these contradictory number mismatches predict which learners will profit from number-word instruction. We used the Give-a-Number task to measure number knowledge in 47 children (mean age = 4.1 years, SD = 0.58), and used the What's on this Card task to assess whether children produced gesture-speech mismatches above their knower level. Children who were early in their number learning trajectories (“one-knowers” and “two-knowers”) were then randomly assigned, within knower level, to one of two training conditions: a Counting condition in which children practiced counting objects, or an Enriched Number Talk condition containing counting, labeling set sizes, spatial alignment of neighboring sets, and comparison of these sets. Controlling for counting ability, we found that children were more likely to learn the meaning of new number words in the Enriched Number Talk condition than in the Counting condition, but only if they had produced gesture-speech mismatches at pretest. The findings suggest that numerical gesture-speech mismatches are a reliable signal that a child is ready to profit from rich number instruction and provide evidence, for the first time, that cardinal number gestures have a role to play in number learning.

7.
Children's gesture production precedes and predicts language development, but the pathways linking these domains are unclear. It is possible that gesture production assists in children's developing word comprehension, which in turn supports expressive vocabulary acquisition. The present study examines this mediation pathway in a population with variability in early communicative abilities: the younger siblings of children with autism spectrum disorder (ASD; high-risk infants, HR). Participants included 92 HR infants and 28 infants at low risk (LR) for ASD. A primary caregiver completed the MacArthur-Bates Communicative Development Inventory (Fenson et al., 1993) at 12, 14, and 18 months, and HR infants received a diagnostic evaluation for ASD at 36 months. Word comprehension at 14 months mediated the relationship between 12-month gesture and 18-month word production in LR and HR infants (ab = 0.263; p < 0.01). For LR infants and for HR infants with no diagnosis or language delay, gesture was strongly associated with word comprehension (as = 0.666, 0.646, 0.561; ps < 0.01). However, this relationship did not hold for infants later diagnosed with ASD (a = 0.073; p = 0.840). This finding adds to a growing literature suggesting that children with ASD learn language differently. Furthermore, this study provides an initial step toward testing the developmental pathways by which infants transition from early actions and gestures to expressive language.
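The reported ab = 0.263 is the indirect effect from a standard mediation model (path a: gesture predicts comprehension; path b: comprehension predicts production with gesture controlled). Below is a minimal sketch of how such an effect is typically estimated, with a percentile-bootstrap confidence interval; the column names are hypothetical placeholders, not the authors' variables.

```python
# Sketch of a simple mediation analysis with a bootstrapped indirect effect.
# df is a pandas DataFrame with one row per infant; column names are invented.
import numpy as np
import statsmodels.formula.api as smf

def indirect_effect(df):
    a = smf.ols("comprehension_14m ~ gesture_12m",
                data=df).fit().params["gesture_12m"]          # path a
    b = smf.ols("production_18m ~ comprehension_14m + gesture_12m",
                data=df).fit().params["comprehension_14m"]    # path b
    return a * b                                              # indirect effect ab

def bootstrap_ci(df, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the indirect effect."""
    rng = np.random.default_rng(seed)
    boots = [indirect_effect(df.iloc[rng.integers(0, len(df), len(df))])
             for _ in range(n_boot)]
    return np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
```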

8.
Co-thought gestures are understudied compared to co-speech gestures, yet they may provide insight into cognitive functions of gestures that are independent of speech processes. A recent study with adults showed that co-thought gesticulation occurred spontaneously during mental preparation of problem solving. Moreover, co-thought gesturing (whether spontaneous or instructed) during mental preparation was effective for subsequent solving of the Tower of Hanoi under conditions of high cognitive load (i.e., when visual working memory capacity was limited and when the task was more difficult). In this preregistered study (https://osf.io/dreks/), we investigated whether co-thought gestures would also spontaneously occur and aid problem-solving processes in children (N = 74; 8–12 years old) under high-load conditions. Although children also spontaneously used co-thought gestures during mental problem solving, this did not aid their subsequent performance when physically solving the problem. If these null results are on track, co-thought gesture effects may differ between adults and children.

9.
Children can understand iconic co-speech gestures that characterize entities by age 3 (Stanfield et al., 2014, J Child Lang 40(2), 1–10; e.g., “I'm drinking” + tilting the hand in a C-shape to the mouth as if holding a glass). In this study, we ask whether children understand co-speech gestures that characterize events as early as they do those for entities and, if so, whether their understanding is influenced by the patterns of gesture production in their native language. We examined this question by studying native English-speaking 3- to 4-year-old children and adults as they completed an iconic co-speech gesture comprehension task involving motion events across two studies. Our results showed that children understood iconic co-speech gestures about events at age 4, marking comprehension of gestures about events one year later than gestures about entities. Our findings also showed that native gesture production patterns influenced children's comprehension of gestures characterizing such events, with better comprehension for gestures that follow language-specific patterns compared with those that do not, particularly for manner of motion. Overall, these results highlight early emerging abilities in gesture comprehension about motion events.

10.
Performing an action has been found to have a greater impact on learning than observing an action. Here we ask whether a particular type of action, the gestures that accompany talk, affects learning in a comparable way. We gave 158 6-year-old children instruction in a mental transformation task. Half the children were asked to produce a Move gesture relevant to the task; half were asked to produce a Point gesture. The children also observed the experimenter producing either a Move or a Point gesture. Children who produced a Move gesture improved more than children who observed the Move gesture. Neither producing nor observing the Point gesture facilitated learning. Doing gesture promotes learning better than seeing gesture, as long as the gesture conveys information that could help solve the task.

11.
Previous research has established that gesture observation aids learning in children. The current study examined whether observation of gestures (i.e., depictive and tracing gestures) differentially affected verbal and visual-spatial retention when learning a route and its street names. Specifically, we explored whether children (n = 97) with lower visual and verbal working-memory capacity benefited more from observing gestures than children who scored higher on these measures. To this end, 11- to 13-year-old children were presented with an instructional video of a route containing no gestures, depictive gestures, tracing gestures, or both depictive and tracing gestures. Results indicated that the type of observed gesture affected performance: observing tracing gestures or both tracing and depictive gestures increased performance on route retention, while observing depictive gestures or both depictive and tracing gestures increased performance on street-name retention. These effects were not differentially affected by working-memory capacity.

12.
People with aphasia use gestures not only to communicate relevant content but also to compensate for their verbal limitations. The Sketch Model (De Ruiter, 2000) assumes a flexible relationship between gesture and speech, with the possibility of a compensatory use of the two modalities. In its successor, the AR-Sketch Model (De Ruiter, 2017), the relationship between iconic gestures and speech is no longer assumed to be flexible and compensatory; instead, iconic gestures are assumed to express information that is redundant with speech. In this study, we evaluated the conflicting predictions of the Sketch Model and the AR-Sketch Model using data collected from people with aphasia as well as from a group of people without language impairment. We found compensatory use of gesture only in the people with aphasia, whereas the people without language impairment made very little compensatory use of gestures. Hence, the people with aphasia gestured according to the prediction of the Sketch Model, whereas the people without language impairment did not. We conclude that aphasia fundamentally changes the relationship of gesture and speech.

13.
The gestures children produce predict the early stages of spoken language development. Here we ask whether gesture is a global predictor of language learning, or whether particular gestures predict particular language outcomes. We observed 52 children interacting with their caregivers at home and found that gesture use at 18 months selectively predicted lexical versus syntactic skills at 42 months, even with early child speech controlled. Specifically, the number of different meanings conveyed in gesture at 18 months predicted vocabulary at 42 months, but the number of gesture+speech combinations did not. In contrast, the number of gesture+speech combinations, particularly those conveying sentence-like ideas, produced at 18 months predicted sentence complexity at 42 months, but meanings conveyed in gesture did not. We can thus predict particular milestones in vocabulary and sentence complexity at age 3½ by watching how children move their hands two years earlier.

14.
Previous literature has demonstrated cultural differences in young children's use of communicative gestures, but the results have been mixed, depending on which gestures were measured and which ages of children were involved. This study included a variety of gesture types and examined whether children's use of communicative gestures varies by cultural background and age. A total of 714 parents of children (6–36 months old) from English-speaking (U.S.A.), German-speaking, and Chinese-speaking (Taiwan) backgrounds completed a questionnaire on their children's use of each gesture described in the survey. We used logistic regressions to examine the effects of children's culture and age, and the interaction effect (culture × age). Children were more likely to use all gestures except reaching, showing, and smacking lips for “yum, yum” as their age increased. In addition, some gestures showed significantly different probabilities across children's cultural backgrounds. A significant interaction effect was shown for five gestures: reaching, showing, pointing, arms up to be picked up, and the “quiet” gesture. Results suggest that the influence of culture on young children's communication emerges in infancy.
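A minimal sketch of the per-gesture analysis described above: a logistic regression of whether a child is reported to use a given gesture, with culture, age, and their interaction as predictors. The data layout and column names are assumptions for illustration, not the authors' dataset.

```python
# Sketch of a per-gesture logistic regression with a culture x age interaction.
# df: pandas DataFrame, one row per child; column names are hypothetical.
import statsmodels.formula.api as smf

def fit_gesture_model(df, gesture_col="uses_pointing"):
    # gesture_col: 0/1 parent report; culture: categorical
    # (US / German / Taiwanese); age_months: child age in months.
    model = smf.logit(f"{gesture_col} ~ C(culture) * age_months", data=df)
    return model.fit(disp=False)

# result = fit_gesture_model(df)
# print(result.summary())  # interaction terms test culture-specific age trends
```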

15.
Gesture and early bilingual development
The relationship between speech and gestural proficiency was investigated longitudinally (from 2 years to 3 years 6 months, at 6-month intervals) in 5 French-English bilingual boys with varying proficiency in their 2 languages. Because of their different levels of proficiency in the 2 languages at the same age, these children's data were used to examine the relative contribution of language and cognitive development to gestural development. In terms of rate of gesture production, rate of gesture production with speech, and meaning of gesture and speech, the children used gestures much like adults from 2 years on. In contrast, the use of iconic and beat gestures showed differential development in the children's 2 languages as a function of mean length of utterance. These data suggest that the development of these kinds of gestures may be more closely linked to language development than other kinds (such as points). Reasons why this might be so are discussed.

16.
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified items to a partner using repeated non-linguistic vocalization, repeated gesture, or repeated non-linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non-linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non-linguistic vocalization because it lends itself more naturally to the production of motivated signs.

17.
We studied how gesture use changes with culture, age, and increased spoken-language competence. A picture-naming task was presented to British (N = 80) and Finnish (N = 41) typically developing children aged 2–5 years. British children were found to gesture more than Finnish children and, in both cultures, gesture production decreased after the age of two. Two-year-olds used proportionally more deictic gestures than iconic gestures compared with older children, and gestured more before the onset of speech rather than simultaneously with or after speech. The British 3- and 5-year-olds gestured significantly more when naming praxic (manipulable) items than non-praxic items. Our results support the view that gesture serves both a communicative and an intrapersonal function, and that the relative weight of these functions may change with age. Speech and language therapists and psychologists observe the development of children's gestures and make predictions on the basis of their frequency and type. To prevent drawing erroneous conclusions about children's linguistic development, it is important to understand developmental and cultural variations in gesture use.

18.
Comparative analysis of the gestural communication of our nearest animal relatives, the great apes, implies that humans should have the biological potential to produce and understand 60–70 gestures, by virtue of shared common descent. These gestures are used intentionally in apes to convey separate requests, rather than as referential items in syntactically structured signals. At present, no such legacy of shared gesture has been described in humans. We suggest that the fate of “ape gestures” in modern human communication is relevant to the debate regarding the evolution of language through a possible intermediate stage of gestural protolanguage.

19.
Gesture Reflects Language Development: Evidence From Bilingual Children
There is a growing awareness that language and gesture are deeply intertwined in the spontaneous expression of adults. Although some research suggests that children use gesture independently of speech, there is scant research on how language and gesture develop in children older than 2 years. We report here on a longitudinal investigation of the relation between gesture and language development in French-English bilingual children from 2 to 3½ years old. The specific gesture types of iconics and beats correlated with the development of the children's two languages, whereas pointing types of gestures generally did not. The onset of iconic and beat gestures coincided with the onset of sentence-like utterances separately in each of the children's two languages. The findings show that gesture is related to language development rather than being independent of it. Contrasting theories about how gesture is related to language development are discussed.

20.
Gesture–speech synchrony re-stabilizes when hand movement or speech is disrupted by a delayed-feedback manipulation, suggesting strong bidirectional coupling between gesture and speech. Yet it has also been argued, from case studies in perceptual–motor pathology, that hand gestures are a special kind of action that does not require closed-loop re-afferent feedback to maintain synchrony with speech. In the current pre-registered within-subject study, we used motion tracking to conceptually replicate McNeill's (1992) classic study on gesture–speech synchrony under normal and 150-ms delayed auditory feedback of speech conditions (NO DAF vs. DAF). Consistent with, and extending, McNeill's original results, we obtain evidence that (a) gesture–speech synchrony is more stable under DAF than under NO DAF (an increased coupling effect), (b) gesture and speech variably entrain to the external auditory delay, as indicated by a consistent shift in gesture–speech synchrony offsets (an entrainment effect), and (c) the coupling effect and the entrainment effect are co-dependent. We suggest, therefore, that gesture–speech synchrony provides a way for the cognitive system to stabilize rhythmic activity under interfering conditions.
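The abstract does not describe the offset computation itself; one common way to quantify gesture-speech synchrony offsets from motion-tracking and audio data is to take the lag that maximizes the cross-correlation of hand speed with the speech amplitude envelope. The sketch below illustrates that idea; the signal names and sampling rate are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: estimate a gesture-speech offset as the lag maximizing the
# cross-correlation between a hand-speed series and a speech amplitude
# envelope sampled at the same rate (both as 1-D numpy arrays).
import numpy as np

def synchrony_offset_ms(hand_speed, speech_envelope, fs=100):
    """Lag of hand movement relative to speech, in milliseconds.
    Positive values mean the gesture peak follows the speech peak."""
    x = (hand_speed - hand_speed.mean()) / hand_speed.std()
    y = (speech_envelope - speech_envelope.mean()) / speech_envelope.std()
    corr = np.correlate(x, y, mode="full")
    lag_samples = int(corr.argmax()) - (len(y) - 1)  # zero lag at index len(y)-1
    return 1000.0 * lag_samples / fs

# Comparing the distribution of offsets between NO DAF and DAF recordings
# would expose the entrainment shift reported above.
```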
