Similar Articles

20 similar articles found.
1.

Objective

To examine whether university mathematics students semantically process gestures depicting mathematical functions (mathematical gestures) in the same way they process action gestures and sentences. Semantic processing was indexed by the N400 effect.

Results

The N400 effect elicited by words primed with mathematical gestures (e.g., “converging” and “decreasing”) was the same in amplitude, latency, and topography as that elicited by words primed with action gestures (e.g., “drive” and “lift”), and as that elicited by the terminal words of sentences.

Significance and conclusion

Findings provide a within-subject demonstration that the topographies of the gesture N400 effect for both action and mathematical words are indistinguishable from that of the standard language N400 effect. This suggests that mathematical function words are processed by the general language semantic system and do not appear to engage areas involved in other mathematical concepts (e.g., numerosity).

2.
To assess priming by iconic gestures, we recorded EEG (at 29 scalp sites) in two experiments while adults watched short, soundless videos of spontaneously produced, cospeech iconic gestures followed by related or unrelated probe words. In Experiment 1, participants classified the relatedness between gestures and words. In Experiment 2, they attended to the stimuli and performed an incidental recognition memory test on words presented during the EEG recording session. Event-related potentials (ERPs) time-locked to the onset of probe words were measured, along with response latencies and word recognition rates. Although word relatedness did not affect reaction times or recognition rates, contextually related probe words elicited less-negative ERPs than did unrelated ones between 300 and 500 msec after stimulus onset (N400) in both experiments. These findings demonstrate sensitivity to semantic relations between iconic gestures and words in brain activity engendered during word comprehension.
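
The mean-amplitude logic behind an N400 effect like the one reported here can be sketched in a few lines of Python. The sketch below simulates single-trial EEG rather than loading real data; the sampling rate, trial counts, and effect sizes are illustrative assumptions, not values from the study:

```python
# Minimal sketch of how an N400 effect is typically quantified: average the
# single-trial epochs per condition, then compare mean amplitude in the
# classic 300-500 ms post-stimulus window. All numbers here are simulated
# and illustrative; the authors' actual preprocessing is not shown.
import numpy as np

rng = np.random.default_rng(0)
sfreq = 250                                # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.8, 1 / sfreq)    # epochs from -100 to 800 ms
n_trials = 40                              # hypothetical trials per condition

def simulate_epochs(n400_amp):
    """Simulate trials with a negative deflection centered near 400 ms."""
    n400 = n400_amp * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
    noise = rng.normal(0, 2.0, size=(n_trials, times.size))
    return n400 + noise                    # shape: (trials, samples), in uV

related = simulate_epochs(n400_amp=-1.0)   # smaller (less negative) N400
unrelated = simulate_epochs(n400_amp=-4.0) # larger N400 to unrelated words

# Condition ERPs, then mean amplitude in the 300-500 ms N400 window.
window = (times >= 0.3) & (times <= 0.5)
erp_related = related.mean(axis=0)
erp_unrelated = unrelated.mean(axis=0)
n400_effect = erp_unrelated[window].mean() - erp_related[window].mean()
print(f"N400 effect (unrelated - related): {n400_effect:.2f} uV")
```

A negative difference here corresponds to the "less-negative ERPs for related words" pattern the abstract describes.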

3.
Variation in how frequently caregivers engage with their children is associated with variation in children's later language outcomes. One explanation for this link is that caregivers use both verbal behaviors, such as labels, and non-verbal behaviors, such as gestures, to help children establish reference to objects or events in the world. However, few studies have directly explored whether language outcomes are more strongly associated with referential behaviors that are expressed verbally, such as labels, or non-verbally, such as gestures, or whether both are equally predictive. Here, we observed caregivers from 42 Spanish-speaking families in the US engage with their 18-month-old children during 5-min lab-based play sessions. Children's language processing speed and vocabulary size were assessed when children were 25 months. Bayesian model comparisons assessed whether the frequencies of caregivers' referential labels, referential gestures, or labels and gestures together were more strongly associated with children's language outcomes than a model with caregivers' total words, or overall talkativeness. The best-fitting models showed that children who heard more referential labels at 18 months were faster in language processing and had larger vocabularies at 25 months. Models including gestures, or labels and gestures together, showed weaker fits to the data. Caregivers' total words predicted children's language processing speed, but predicted vocabulary size less well. These results suggest that the frequency with which caregivers of 18-month-old children use referential labels, more so than referential gestures, is a critical feature of caregiver verbal engagement that contributes to language processing development and vocabulary growth. (A minimal sketch of the model-comparison logic appears after the research highlights below.)

Research Highlights

  • We examined the frequency of referential communicative behaviors, via labels and/or gestures, produced by caregivers during a 5-min play interaction with their 18-month-old children.
  • We assessed predictive relations between labels, gestures, their combination, as well as total words spoken, and children's processing speed and vocabulary growth at 25 months.
  • Bayesian model comparisons showed that caregivers' referential labels at 18 months best predicted both 25-month language measures, although total words also predicted later processing speed.
  • Frequent use of referential labels by caregivers, more so than referential gestures, is a critical feature of communicative behavior that supports children's later vocabulary learning.
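The abstract does not describe the study's exact Bayesian machinery, so as a hedged illustration the sketch below approximates the same model-comparison logic with ordinary least-squares regressions and BIC, via the common approximation BF10 ≈ exp(ΔBIC/2). All data, predictor names, and effect sizes are simulated assumptions:

```python
# Hedged sketch of the model-comparison logic described above, not the
# authors' pipeline: compare regressions predicting a child outcome from
# (a) referential labels, (b) referential gestures, (c) both, or (d) total
# words, using BIC as a rough stand-in for Bayesian model evidence
# (BF10 ~= exp(dBIC / 2)). All data and effect sizes are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 42                                      # families, as in the study
labels = rng.poisson(20, n).astype(float)   # referential labels per session
gestures = rng.poisson(10, n).astype(float) # referential gestures per session
total_words = labels * 5 + rng.poisson(50, n)
vocab = 30 + 3.0 * labels + rng.normal(0, 10, n)  # labels drive the outcome

models = {
    "labels": [labels],
    "gestures": [gestures],
    "labels + gestures": [labels, gestures],
    "total words": [total_words],
}
for name, predictors in models.items():
    X = sm.add_constant(np.column_stack(predictors))
    bic = sm.OLS(vocab, X).fit().bic
    print(f"{name:18s} BIC = {bic:7.1f}")   # lower BIC = better-fitting model
```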

4.
This study explored children's development in comprehending four types of pointing gestures of differing familiarity. Our aim was to assess human infants' pointing comprehension abilities under the same conditions used with various animal species. Sixteen children were tested longitudinally in a two-choice task from 1 year of age. At 12 and 14 months, infants did not exceed chance level with any of the gestures used. Infants were successful with distal pointing and long cross-pointing at 16 months. By 18 months, infants showed a high success rate with the less familiar gestures (forward cross-pointing and far pointing) as well. Their skills at this older age show close similarity to those demonstrated previously by dogs tested with exactly the same procedures. Our longitudinal data also revealed that in a few infants the ability to comprehend pointing gestures is already apparent before 16 months of age. In general, we found large individual variation; this has been described for a variety of cognitive skills in human development and seems to be typical of pointing comprehension as well.
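
In a two-choice task like this one, the chance-level comparison reduces to a binomial test against p = 0.5. A minimal sketch, with hypothetical trial counts (the abstract does not report the study's actual numbers):

```python
# Sketch of the chance-level logic in a two-choice pointing task: with two
# hiding locations, success probability under chance is p = 0.5, so
# above-chance comprehension can be tested with a one-sided binomial test.
# The trial and success counts below are hypothetical, not the study's.
from scipy.stats import binomtest

n_trials = 20    # hypothetical two-choice trials per gesture type
n_correct = 16   # hypothetical number of correct choices
result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"{n_correct}/{n_trials} correct, one-sided p = {result.pvalue:.4f}")
# A p-value below .05 would count as exceeding chance for that gesture type.
```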

5.
Wu YC, Coulson S. Brain and Language, 2007, 101(3): 234–245.
EEG was recorded as adults watched short segments of spontaneous discourse in which the speaker's gestures and utterances contained complementary information. Videos were followed by one of four types of picture probes: cross-modal related probes were congruent with both speech and gestures; speech-only related probes were congruent with information in the speech, but not the gesture; and two sorts of unrelated probes were created by pairing each related probe with a different discourse prime. Event-related potentials (ERPs) elicited by picture probes were measured within the time windows of the N300 (250-350 ms post-stimulus) and N400 (350-550 ms post-stimulus). Cross-modal related probes elicited smaller N300 and N400 than speech-only related ones, indicating that pictures were easier to interpret when they corresponded with gestures. N300 and N400 effects were not due to differences in the visual complexity of each probe type, since the same cross-modal and speech-only picture probes elicited N300 and N400 with similar amplitudes when they appeared as unrelated items. These findings extend previous research on gesture comprehension by revealing how iconic co-speech gestures modulate conceptualization, enabling listeners to better represent visuo-spatial aspects of the speaker's meaning.

6.
The recognition of iconic correspondence between signal and referent has been argued to bootstrap the acquisition and emergence of language. Here, we study the ontogeny, and to some extent the phylogeny, of the ability to spontaneously relate iconic signals (gestures and/or vocalizations) to previous experience. Children at 18, 24, and 36 months of age (N = 216) and great apes (N = 13) interacted with two apparatuses, each comprising a distinct action and sound. Subsequently, an experimenter mimicked either the action, the sound, or both in combination to refer to one of the apparatuses. Experiments 1 and 2 found no spontaneous comprehension in great apes or in 18-month-old children. At 24 months of age, children were successful with a composite vocalization-gesture signal but not with either vocalization or gesture alone. At 36 months, children succeeded both with a composite vocalization-gesture signal and with gesture alone, but not with vocalization alone. In general, gestures were understood better than vocalizations. Experiment 4 showed that gestures were understood irrespective of how children learned about the corresponding action (through observation or self-experience). This pattern of results demonstrates that iconic signals can be a powerful way to establish reference in the absence of language, but they are not trivial for children to comprehend, and not all iconic signals are created equal.

7.
Combinations of different sensory words produce mismatch expressions like “smooth color” and “red touch,” in contrast to normal expressions like “red color” and “smooth touch.” Concerning these sensory mismatch expressions, the results of three experiments are reported. Experiment 1 revealed that (i) mismatch expressions were less comprehensible than normal expressions, and (ii) there were two patterns among mismatch expressions: high-comprehensible mismatch expressions (HighCME, e.g., “smooth color”) and low-comprehensible mismatch expressions (LowCME, e.g., “red touch”). Experiment 2 revealed that mismatch expressions produced a significantly greater N400 amplitude than normal expressions. Experiment 3 suggested that the difference between High- and LowCME was reflected in a later latency band or in a topographical difference of the N400, although the statistical significance was marginal. It is argued that the processes that integrate linguistic elements (e.g., combining adjectives and nouns) are not homogeneous.

8.
In recent years, studies have suggested that gestures influence the comprehension of linguistic expressions, for example, eliciting an N400 component in response to a speech/gesture mismatch. In this paper, we investigate the role of gestural information in the understanding of metaphors. Event-related potentials (ERPs) were recorded while participants viewed video clips of an actor uttering metaphorical expressions and producing bodily gestures that were congruent or incongruent with the metaphorical meaning of those expressions. This mode of stimulus presentation allows a more ecologically valid approach to meaning integration. When ERPs were calculated using the gesture stroke as the time-locking event, gesture incongruity with the metaphorical expression modulated the amplitude of the N400 and of the late positive complex (LPC). This suggests that gestural and speech information are combined online to make sense of the interlocutor's linguistic production in an early stage of metaphor comprehension. Our data favor the idea that meaning construction is globally integrative and highly context-sensitive.

9.
Liu Y, Shu H, Wei J. Brain and Language, 2006, 96(1): 37–48.
Two event-related potential (ERP) experiments were conducted to investigate spoken word recognition in Chinese and the effect of contextual constraints on this process. In Experiment 1, three kinds of incongruous words were formed by altering the first, the second, or both syllables of the congruous disyllabic terminal words in high-constraint spoken sentences. Results showed an increase in N400 amplitude in all three incongruous word conditions, and a delayed N400 effect in the cohort incongruous condition compared with the rhyme incongruous and plain incongruous conditions. In addition, unlike results in English, we found that the N400 effect in the rhyme incongruous condition disappeared earlier than in the plain incongruous condition. In Experiment 2, three kinds of nonwords derived from sentence-congruous words were constructed by altering few or many phonetic features of the onset, or of the whole of the first syllable, and the resulting nonwords appeared as disyllabic terminal forms in either high- or low-constraint sentences. All three nonword conditions elicited the N400 component. In addition, in high-constraint sentences but not in low-constraint ones, the amplitude and duration of the N400 varied as a function of the degree of phonetic mismatch between the terminal nonword and the expected congruous word.

10.
This paper describes the emergence and development of three object-related gestures: pointing, extending objects, and open-handed reaching, in four first-born infants from 9 to 18 months during natural interactions with their mothers. It examines the changing characteristics of the gestures and the acquisition of conventional words in accompaniment. Furthermore, it investigates the role that the capacity for dual-directional signaling, sending simultaneously two coordinated but divergently directed nonverbal signals of gesture and gaze, may play in this transition. Analysis revealed that dual-directional signaling appeared concurrently across gestures. In addition, dual-directional signaling was employed in a socially adjusted manner, more with pointing, especially spontaneous pointing when the mothers' attention could not be assumed. Verbal accompaniments appeared with gestures only when the children had mastered dual-directional signaling. Then words emerged approximately simultaneously with more than one kind of gesture.

11.
This study deals with the adjustment of requests to the communication situation at two stages of development: the end of the prelinguistic period (18 months) and the beginning of the linguistic period (30 months). The main objective is to show how language acquisition introduces new modalities of adjustment. The productions of two groups of 12 children (18 and 30 months) were compared in three situations of object requesting: (a) the adult complies with the request (satisfaction), (b) the adult asks a clarification question (clarification), and (c) the adult refuses to comply with the request (refusal). The similarities observed suggest a continuity between the two ages with respect to the functional aspects of requests. As for the structural aspects, the results reveal a partial continuity between the prelinguistic children (18 months) and the linguistic children (30 months): at 18 months, children use vocalizations as they use words at 30 months, and gestures are used similarly at both ages. The differences observed between the age groups suggest that, at 30 months, language is used to find a solution to the request situation, yet without any major changes in communicative strategies. The results of this study put into perspective the revolution brought about by language in the child's behaviour and specify its contribution to the adjustment of messages as a function of the communication situation.

12.
Research has shown a close relationship between gestures and language development. In this study, we investigate the cross-lagged relationships between different types of gestures and two lexicon dimensions: the number of words produced and the number of words comprehended. Information about gestures and lexical development was collected from 48 typically developing infants when they were aged 0;9, 1;0, and 1;3. The European Portuguese version of the MacArthur–Bates Communicative Development Inventory: Words and Gestures (PT CDI:WG) was used. The results indicated that the total number of actions and gestures and the number of early gestures produced at 0;9 and at 1;0 predicted the number of words comprehended three months later. The predictive power of actions and gestures for the number of words produced was limited to the 0;9–1;0 interval. The opposite relationship was not found: word comprehension and production did not predict actions and gestures three months later. These results highlight the importance of non-verbal communicative behavior in language development.
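
A cross-lagged analysis of this kind asks whether gestures at one wave predict vocabulary at the next wave beyond the stability of vocabulary itself. The sketch below illustrates that logic with a simple regression on simulated data; the variable names and effect sizes are invented assumptions, not the study's:

```python
# Sketch of a simple cross-lagged regression of the kind described above:
# does gesture production at wave 1 predict word comprehension at wave 2,
# over and above comprehension at wave 1? Data, variable names, and effect
# sizes are simulated for illustration; they are not the study's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 48                                          # infants, as in the study
gestures_t1 = rng.poisson(12, n).astype(float)  # gestures at 0;9
words_t1 = rng.poisson(8, n).astype(float)      # words comprehended at 0;9
# Simulated comprehension at 1;0, partly driven by earlier gestures.
words_t2 = 5 + 1.2 * gestures_t1 + 0.8 * words_t1 + rng.normal(0, 5, n)

X = sm.add_constant(np.column_stack([words_t1, gestures_t1]))
fit = sm.OLS(words_t2, X).fit()
print(fit.params)    # coefficients: [intercept, words_t1, gestures_t1]
print(fit.pvalues)   # a reliable gestures_t1 term is the cross-lagged effect
```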

13.
Perea, Duñabeitia, and Carreiras (Journal of Experimental Psychology: Human Perception and Performance, 34: 237–241, 2008) found that LEET stimuli, formed by a mixture of digits and letters (e.g., T4BL3 instead of TABLE), produced priming effects similar to those for regular words. This finding led them to conclude that LEET stimuli automatically activate lexical information. In the present study, we examined whether semantic activation occurs for LEET stimuli by using an electrophysiological measure, the N400 effect, which reflects detection of a mismatch between a word and the current semantic context. This N400 effect could occur only if the LEET stimulus had been identified and processed semantically. Participants determined whether a stimulus (word or LEET) was related to a given category (e.g., APPLE or 4PPL3 belongs to the category “fruit,” but TABLE or T4BL3 does not). We found that LEET stimuli produced an N400 effect similar in magnitude to that for regular uppercase words, suggesting that LEET stimuli can access meaning in a manner similar to words presented in consistent uppercase letters.

14.
Conceptual integration and metaphor: an event-related potential study
Event-related brain potentials (ERPs) were recorded from 18 normal adults as they read sentences that ended with words used literally, metaphorically, or in an intermediate literal-mapping condition. In the latter condition, the literal sense of the word was used in a way that prompted readers to map conceptual structure from a different domain. ERPs measured from 300 to 500 msec after the onset of the sentence-final words differed as a function of metaphoricity: literal endings elicited the smallest N400, metaphors the largest, and literal mappings an N400 of intermediate amplitude. Metaphoric endings also elicited a larger posterior positivity than did either literal or literal-mapping words. Consistent with conceptual blending theory, the results suggest that the demands of conceptual integration affect the difficulty of both literal and metaphorical language.

15.
Healthy subjects performed a lexical decision task in a semantic priming paradigm while event-related potentials (ERPs) were recorded from 64 channels. Semantic distance between prime and target was varied by including directly related, indirectly related, and nonrelated word pairs. At centro-parietal electrodes, an N400 to nonrelated pairs was elicited bilaterally; it was sensitive to direct, but not to indirect, semantic priming. These N400 priming effects were mirrored by the RT data. At inferior fronto-temporal sites, directly related words showed ERP priming effects over both hemispheres. However, indirectly related words elicited ERP priming effects only over the right hemisphere. These results support the hypothesis that the right-hemisphere semantic system is involved in the processing of remote semantic information.

16.
Gesture and language are tightly connected during the development of a child's communication skills. Gestures mostly precede and shape the course of language development, although effects in the opposite direction have also been found. A few recent studies have focused on the relationship between specific gestures and specific word categories, emphasising that the onset of one gesture type predicts the onset of certain word categories or of the earliest word combinations. The aim of this study was to analyse the predictive roles of different gesture types on the onset of the first word categories in a child's early expressive vocabulary. Our data show that different types of gestures predict different types of word production. Object gestures predict open-class words from the age of 13 months, and gestural routines predict closed-class words and social terms from 8 months. Receptive vocabulary has a strong mediating role for all linguistically defined categories (open- and closed-class words) but not for social terms, which are the largest word category in a child's early expressive vocabulary. Accordingly, the main contribution of this study is to define the impact of different gesture types on early expressive vocabulary and to determine the role of receptive vocabulary in the gesture–expressive vocabulary relation in the Croatian language.

17.
The present study used event-related potentials (ERPs) to determine the degree to which people can process words while devoting central attention to another task. Experiments 1-4 measured the N400 effect, which is sensitive to the degree of mismatch between a word and the current semantic context. Experiment 5 measured the P3 difference between low- and high-frequency words. Because these effects can occur only if a word has been identified, both ERP components index word processing. The authors found that the N400 effect (Experiments 1, 3, and 4) and the P3 difference (Experiment 5) were strongly attenuated for Task 2 words presented nearly simultaneously with Task 1. No such attenuation was found when the Task 1 stimulus was presented but required no response (Experiment 2). Strong attenuation was also evident when the Task 2 word was presented before the Task 1 stimulus (Experiment 4), suggesting that central resources are not allocated to stimuli on a first-come, first-served basis but rather are strategically locked to Task 1. The authors conclude that visual word processing is not fully automatic but requires access to limited central attentional resources.

18.
王振宏, 姚昭. 《心理学报》 2012, 44(2): 154–165.
Concreteness and emotionality are distinct factors influencing word processing; both high concreteness and emotionality can facilitate it. This study manipulated word concreteness and emotionality simultaneously, using a lexical decision task and a valence judgment task, to examine the concreteness effect for emotional nouns and whether that effect is modulated by a word's emotional information. The results showed that the concreteness effect for emotional nouns depended on whether the emotion task was implicit or explicit: concrete emotional words were responded to faster and more accurately than abstract emotional words, and elicited a larger N400 and a reduced LPC, although the LPC concreteness effect appeared only in the implicit emotion task. The interaction between concreteness and emotionality emerged at the semantic processing stage in the implicit emotion task: for positive and negative words, concrete and abstract words did not differ significantly on the N400, whereas neutral concrete and abstract words did, suggesting that a word's emotional information provides sufficient context for processing abstract words and thereby eliminates the processing advantage of concrete words.

19.
One of the most fascinating phenomena in early development is that babies not only understand signs others direct to them and later use them to communicate with others, but they also come to direct the same signs towards themselves in a private way. Private gestures become "tools of thought". There is a considerable literature about private language, but almost nothing about private gestures. Private gestures pose an intriguing communicative puzzle: they are communicative, but with the self. In this paper we study two types of private gestures (signs) before language: (1) private ostensive gestures and (2) private pointing gestures. We show in a case study of one child between 12 and 18 months of age that both are used with a self-reflexive function, as a way of "thinking" what to do, in order to solve a problem in the conventional use of an object. The private gestures become self-reflexive signs.

20.
To assess predictive relations between joint attention skills, intention understanding, and mental state vocabulary, 88 children were tested with measures of comprehension of gaze and referential pointing, as well as the production of declarative gestures and the comprehension and production of imperative gestures, at the ages of 7-18 months. Infants' intention-based imitation skills were assessed at 12, 15, and 18 months. At the ages of 24 and 36 months, toddlers' internal state lexicon was evaluated by parents with a German adaptation of the Mental State Language Questionnaire (Olineck & Poulin-Dubois, 2005). Regression analyses revealed that 9-month-olds' comprehension of referential pointing contributed significantly to the prediction of intention-based imitation skills at 15 months, as well as to children's volition and cognition vocabularies at 24 and 36 months, respectively. Moreover, 12-month-olds' comprehension of an imperative motive was shown to selectively predict toddlers' use of volition terms at 24 months. Overall, these results provide empirical evidence for both general and specific developmental relations between preverbal communication skills and mental state language, thus implying developmental continuity within the social domain in the first 3 years of life.
