Similar Documents
20 similar documents found (search time: 62 ms)
1.
Visible embodiment: Gestures as simulated action
Spontaneous gestures that accompany speech are related to both verbal and spatial processes. We argue that gestures emerge from perceptual and motor simulations that underlie embodied language and mental imagery. We first review current thinking about embodied cognition, embodied language, and embodied mental imagery. Next, we provide evidence that gestures stem from spatial representations and mental images, and we propose the gestures-as-simulated-action framework to explain how gestures might arise from an embodied cognitive system. Finally, we compare this framework with other current models of gesture production and briefly outline predictions that derive from it.

2.
Wang Hui & Li Guangzheng. 《心理科学进展》 (Advances in Psychological Science), 2021, 29(9): 1617–1627
Gestures are hand movements, produced during communication or cognitive processes, that do not act directly on objects; they can be concrete or abstract. Gestures are classified mainly by their source, their content, their intent, and how well they match the accompanying speech. Different types of gesture differ in when they emerge and in their developmental trajectories. Gestures facilitate children's word learning, verbal expression, mathematical problem solving, spatial learning, and memory, but findings on their effect on speech comprehension remain inconsistent. Future research could examine the relationship between different gesture types and children's cognitive development, and compare the advantages of gestures from different sources across learning domains.

3.
A growing body of evidence suggests that human language may have emerged primarily in the gestural rather than vocal domain, and that studying gestural communication in great apes is crucial to understanding language evolution. Although manual and bodily gestures are considered distinct at a neural level, there has been very limited consideration of potential differences at a behavioural level. In this study, we conducted naturalistic observations of adult wild East African chimpanzees (Pan troglodytes schweinfurthii) in order to establish a repertoire of gestures, and examine intentionality of gesture production, use and comprehension, comparing across manual and bodily gestures. At the population level, 120 distinct gesture types were identified, consisting of 65 manual gestures and 55 bodily gestures. Both bodily and manual gestures were used intentionally and effectively to attain specific goals, by signallers who were sensitive to recipient attention. However, manual gestures differed from bodily gestures in terms of communicative persistence, indicating a qualitatively different form of behavioural flexibility in achieving goals. In both repertoire size and frequency of use, manual gestures were more affiliative than bodily gestures, while bodily gestures were more antagonistic. These results indicate that manual gestures may have played a significant role in the emergence of increased flexibility in great ape communication and social bonding.

4.
This study examined the use of sensory modalities relative to a partner's behavior in gesture sequences during captive chimpanzee play at the Chimpanzee and Human Communication Institute. We hypothesized that chimpanzees would use visual gestures toward attentive recipients and auditory/tactile gestures toward inattentive recipients. We also hypothesized that gesture sequences would be more prevalent toward unresponsive rather than responsive recipients. The chimpanzees used significantly more auditory/tactile than visual gestures first in sequences with both attentive and inattentive recipients. They rarely used visual gestures toward inattentive recipients. Auditory/tactile gestures were effective with, and used toward, both attentive and inattentive recipients. Recipients responded significantly more to single gestures than to first gestures in sequences. Sequences often indicated that recipients had not responded to an initial gesture, whereas effective single gestures made further gestures unnecessary. The chimpanzees thus gestured appropriately relative to a recipient's behavior and modified their interactions according to contextual social cues.

5.
This study looks at whether there is a relationship between mother and infant gesture production. Specifically, it addresses the extent of articulation in the maternal gesture repertoire and how closely it supports the infant production of gestures. Eight Spanish mothers and their 1‐ and 2‐year‐old babies were studied during 1 year of observations. Maternal and child verbal production, gestures and actions were recorded at their homes on five occasions while performing daily routines. Results indicated that mother and child deictic gestures (pointing and instrumental) and representational gestures (symbolic and social) were very similar at each age group and did not decline across groups. Overall, deictic gestures were more frequent than representational gestures. Maternal adaptation to developmental changes is specific for gesturing but not for acting. Maternal and child speech were related positively to mother and child pointing and representational gestures, and negatively to mother and child instrumental gestures. Mother and child instrumental gestures were positively related to action production, after maternal and child speech was partialled out. Thus, language plays an important role for dyadic communicative activities (gesture–gesture relations) but not for dyadic motor activities (gesture–action relations). Finally, a comparison of the growth curves across sessions showed a closer correspondence for mother–child deictic gestures than for representational gestures. Overall, the results point to the existence of an articulated maternal gesture input that closely supports the child gesture production. Copyright © 2006 John Wiley & Sons, Ltd.

6.
This article describes the distribution and development of handedness for manual gestures in captive chimpanzees. Data on handedness for unimanual gestures were collected in a sample of 227 captive chimpanzees. Handedness for these gestures was compared with handedness for three other measures of hand use: tool use, reaching, and coordinated bimanual actions. Chimpanzees were significantly more right-handed for gestures than for all other measures of hand use. Hand use for simple reaching at 3 to 4 years of age predicted hand use for gestures 10 years later. Use of the right hand for gestures was significantly higher when gestures were accompanied by a vocalization than when they were not. The collective results suggest that left-hemisphere specialization for language may have evolved initially from asymmetries in manual gestures in the common ancestor of chimpanzees and humans, rather than from hand use associated with other, non-communicative motor actions, including tool use and coordinated bimanual actions, as has been previously suggested in the literature.

7.
The aim of the present study is to compare the pragmatic ability of right- and left-hemisphere-damaged patients while excluding the possible interference of linguistic deficits. To this aim, we study extralinguistic communication, that is, communication performed only through gestures. The Cognitive Pragmatics Theory provides the theoretical framework: it predicts a gradient of difficulty in the comprehension of different pragmatic phenomena, which should hold independently of whether language or gestures serve as the communicative means. An experiment involving 10 healthy individuals, 10 right- and 9 left-hemisphere-damaged patients shows that pragmatic performance is better preserved in left-hemisphere-damaged (LHD) patients than in right-hemisphere-damaged (RHD) patients.

8.
Research has shown a close relationship between gestures and language development. In this study, we investigate the cross-lagged relationships between different types of gestures and two lexicon dimensions: number of words produced and comprehended. Information about gestures and lexical development was collected from 48 typically developing infants when they were aged 0;9, 1;0 and 1;3. The European Portuguese version of the MacArthur–Bates Communicative Development Inventory: Words and Gestures (PT CDI:WG) was used. The results indicated that the total number of actions and gestures and the number of early gestures produced at 0;9 and at 1;0 predicted the number of words comprehended three months later. The predictive power of actions and gestures for the number of words produced was limited to the 0;9–1;0 interval. The opposite relationship was not found: word comprehension and production did not predict actions and gestures three months later. These results highlight the importance of non-verbal communicative behavior in language development.

9.
Infants younger than 20 months of age interpret both words and symbolic gestures as object names. Later in development, words and gestures take on divergent communicative functions. Here, we examined patterns of brain activity to words and gestures in typically developing infants at 18 and 26 months of age. Event-related potentials (ERPs) were recorded during a match/mismatch task. At 18 months, an N400 mismatch effect was observed for pictures preceded by both words and gestures. At 26 months, the N400 effect was limited to words. The results provide the first neurobiological evidence showing developmental changes in semantic processing of gestures.

10.
Research has shown that social and symbolic cues presented in isolation and at fixation have strong effects on observers, but it is unclear how cues compare when they are presented away from fixation and embedded in natural scenes. We here compare the effects of two types of social cue (gaze and pointing gestures) and one type of symbolic cue (arrow signs) on eye movements of observers under two viewing conditions (free viewing vs. a memory task). The results suggest that social cues are looked at more quickly, for longer and more frequently than the symbolic arrow cues. An analysis of saccades initiated from the cue suggests that the pointing cue leads to stronger cueing than the gaze and the arrow cue. While the task had only a weak influence on gaze orienting to the cues, stronger cue following was found for free viewing compared to the memory task.

11.
More gestures than answers: children learning about balance

12.
The way adults express manner and path components of a motion event varies across typologically different languages both in speech and cospeech gestures, showing that language specificity in event encoding influences gesture. The authors tracked when and how this multimodal cross-linguistic variation develops in children learning Turkish and English, 2 typologically distinct languages. They found that children learn to speak in language-specific ways from age 3 onward (i.e., English speakers used 1 clause and Turkish speakers used 2 clauses to express manner and path). In contrast, English- and Turkish-speaking children's gestures looked similar at ages 3 and 5 (i.e., separate gestures for manner and path), differing from each other only at age 9 and in adulthood (i.e., English speakers used 1 gesture, but Turkish speakers used separate gestures for manner and path). The authors argue that this pattern of the development of cospeech gestures reflects a gradual shift to language-specific representations during speaking and shows that looking at speech alone may not be sufficient to understand the full process of language acquisition.

13.
Integration of simultaneous auditory and visual information about an event can enhance our ability to detect that event. This is particularly evident in the perception of speech, where the articulatory gestures of the speaker's lips and face can significantly improve the listener's detection and identification of the message, especially when that message is presented in a noisy background. Speech is a particularly important example of multisensory integration because of its behavioural relevance to humans and also because brain regions have been identified that appear to be specifically tuned for auditory speech and lip gestures. Previous research has suggested that speech stimuli may have an advantage over other types of auditory stimuli in terms of audio-visual integration. Here, we used a modified adaptive psychophysical staircase approach to compare the influence of congruent visual stimuli (brief movie clips) on the detection of noise-masked auditory speech and non-speech stimuli. We found that congruent visual stimuli significantly improved detection of an auditory stimulus relative to incongruent visual stimuli. This effect, however, was equally apparent for speech and non-speech stimuli. The findings suggest that speech stimuli are not specifically advantaged by audio-visual integration for detection at threshold when compared with other naturalistic sounds.

14.
Great ape gestural communication is known to be intentional, elaborate and flexible; yet there is controversy over the best interpretation of the system and how gestures are acquired, perhaps because most studies have been made in restricted, captive settings. Here, we report the first systematic analysis of gesture in a population of wild chimpanzees. Over 266 days of observation, we recorded 4,397 cases of intentional gesture use in the Sonso community, Budongo, Uganda. We describe 66 distinct gesture types: this estimate appears close to asymptote, and the Sonso repertoire includes most gestures described informally at other sites. Differences in repertoire were noted between individuals and age classes, but in both cases, the measured repertoire size was predicted by the time subjects were observed gesturing. No idiosyncratic usages were found, i.e. no gesture type was used only by one individual. No support was found for the idea that gestures are acquired by ‘ontogenetic ritualization’ from originally effective actions; moreover, in detailed analyses of two gestures, action elements composing the gestures did not closely match those of the presumed original actions. Rather, chimpanzee gestures are species-typical; indeed, many are ‘family-typical’, because gesture types recorded in gorillas, orangutans and chimpanzees overlap extensively, with 24 gestures recorded in all three genera. Nevertheless, chimpanzee gestures are used flexibly across a range of contexts and show clear adjustment to audience (e.g. silent gestures for attentive targets, contact gestures for inattentive ones). Such highly intentional use of a species-typical repertoire raises intriguing questions for the evolution of advanced communication.

15.
The effects of prohibiting gestures on children's lexical retrieval ability
Two alternative accounts have been proposed to explain the role of gestures in thinking and speaking. The Information Packaging Hypothesis (Kita, 2000) claims that gestures are important for the conceptual packaging of information before it is coded into a linguistic form for speech. The Lexical Retrieval Hypothesis (Rauscher, Krauss & Chen, 1996) sees gestures as functioning more at the level of speech production in helping the speaker to find the right words. The latter hypothesis has not been fully explored with children. In this study children were given a naming task under conditions that allowed and restricted gestures. Children named more words correctly and resolved more 'tip-of-the-tongue' states when allowed to gesture than when not, suggesting that gestures facilitate access to the lexicon in children and are important for speech production as well as conceptualization.

16.
Both vocalization and gesture are universal modes of communication and fundamental features of language development. The gestural origins theory proposes that language evolved out of early gestural use. However, evidence reported here suggests vocalization is much more prominent in early human communication than gesture is. To our knowledge no prior research has investigated the rates of emergence of both gesture and vocalization across the first year in human infants. We evaluated the rates of gestures and speech-like vocalizations (protophones) in 10 infants at 4, 7, and 11 months of age using parent-infant laboratory recordings. We found that infant protophones outnumbered gestures substantially at all three ages, ranging from >35 times more protophones than gestures at 4 months, to >2.5 times more protophones than gestures at 11 months. The results suggest vocalization, not gesture, is the predominant mode of communication in human infants in the first year.

17.
Variation in how frequently caregivers engage with their children is associated with variation in children's later language outcomes. One explanation for this link is that caregivers use both verbal behaviors, such as labels, and non-verbal behaviors, such as gestures, to help children establish reference to objects or events in the world. However, few studies have directly explored whether language outcomes are more strongly associated with referential behaviors that are expressed verbally, such as labels, or non-verbally, such as gestures, or whether both are equally predictive. Here, we observed caregivers from 42 Spanish-speaking families in the US engage with their 18-month-old children during 5-min lab-based, play sessions. Children's language processing speed and vocabulary size were assessed when children were 25 months. Bayesian model comparisons assessed the extent to which the frequencies of caregivers’ referential labels, referential gestures, or labels and gestures together, were more strongly associated with children's language outcomes than a model with caregiver total words, or overall talkativeness. The best-fitting models showed that children who heard more referential labels at 18 months were faster in language processing and had larger vocabularies at 25 months. Models including gestures, or labels and gestures together, showed weaker fits to the data. Caregivers’ total words predicted children's language processing speed, but predicted vocabulary size less well. These results suggest that the frequency with which caregivers of 18-month-old children use referential labels, more so than referential gestures, is a critical feature of caregiver verbal engagement that contributes to language processing development and vocabulary growth.

Research Highlights

  • We examined the frequency of referential communicative behaviors, via labels and/or gestures, produced by caregivers during a 5-min play interaction with their 18-month-old children.
  • We assessed predictive relations between labels, gestures, their combination, as well as total words spoken, and children's processing speed and vocabulary growth at 25 months.
  • Bayesian model comparisons showed that caregivers’ referential labels at 18 months best predicted both 25-month vocabulary measures, although total words also predicted later processing speed.
  • Frequent use of referential labels by caregivers, more so than referential gestures, is a critical feature of communicative behavior that supports children's later vocabulary learning.

18.
This study analyzes the emergent use of gestures among 9- to 12-month-old infants with autism and typical development, using retrospective video analysis. The purpose of the present investigation was to examine the frequency, initiation, prompting, and diversity of types of gestures used for social interaction purposes. It was hypothesized that a restricted variety in types of gestures, as well as fewer child-initiated gestures and more prompted gestures, would be associated with a later diagnosis of autism. Logistic regression analysis found that decreased variety in the types of gestures used was significantly associated with autism status. Neither the total number of gestures nor the initiation of gestures (child-initiated vs. prompted) was significantly associated with autism status.

19.
Great ape gestures have attracted considerable research interest in recent years, prompted by their flexible and intentional pattern of use; but almost all studies have focused on single gestures. Here, we report the first quantitative analysis of sequential gesture use in western gorillas (Gorilla gorilla gorilla), using data from three captive groups and one African study site. We found no evidence that gesture sequences were given for reasons of increased communicative efficiency over single gestures. Longer sequences of repeated gestures did not increase the likelihood of response, and using a sequence was seldom in reaction to communicative failure. Sequential combination of two gestures with similar meanings did not generally increase effectiveness, and sometimes reduced it. Gesture sequences were closely associated with play contexts. Markov transition analysis showed two networks of frequently co-occurring gestures, both consisting of gestures used to regulate play. One network comprised only tactile gestures, the other a mix of silent, audible and tactile gestures; apparently, these clusters resulted from gesture use in play with proximal or distal contact, respectively. No evidence was found for syntactic effects of sequential combination: meanings changed little or not at all. Semantically, many gestures overlapped massively with others in their core information (i.e. message), and gesture messages spanned relatively few functions. We suggest that the underlying semantics of gorilla gestures is highly simplified compared to that of human words. Gesture sequences allow continual adjustment of the tempo and nature of social interactions, rather than generally conveying semantically referential information or syntactic structures.

20.
In this study, the development and alternation of nonreferential gestures were examined longitudinally in terms of the acquisition of Japanese sign language. Parent–child free‐play sessions in the home were videotaped at monthly visits. Hand activities produced by two deaf infants of deaf parents are described and analyzed. Nonreferential gestures were observed frequently just before the occurrence of the first signs; they consisted of many rhythmic and repetitious movements. Nonreferential gestures became more complex, and their number also increased as the infants grew older. A comparison of nonreferential gestures and first signs revealed continuity between them in terms of movement. In conclusion, nonreferential gestures are a manual analog of vocal babbling.
