Search results: 57 items (31 ms).
1.
The origin and functions of the hand and arm gestures that accompany speech production are poorly understood. It has been proposed that gestures facilitate lexical retrieval, but little is known about when retrieval is accompanied by gestural activity and how this activity is related to the semantics of the word to be retrieved. Electromyographic (EMG) activity of the dominant forearm was recorded during a retrieval task in which participants tried to identify target words from their definitions. EMG amplitudes were significantly greater for concrete than for abstract words. The relationship between EMG amplitude and other conceptual attributes of the target words was examined. EMG was positively related to a word’s judged spatiality, concreteness, drawability, and manipulability. The implications of these findings for theories of the relation between speech production and gesture are discussed. This experiment was done by the first author under the supervision of the second author in partial completion of the Ph.D. degree at Columbia University. We gratefully acknowledge the advice and comments of Lois Putnam, Robert Remez, James Magnuson, Michele Miozzo, and Robert B. Tallarico, and the assistance of Stephen Krieger, Lauren Walsh, Jennifer Kim, and Jillian White.
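The concrete-versus-abstract contrast reported above boils down to a within-subject comparison of mean EMG amplitudes. As a minimal sketch (the data and the `paired_t` helper are illustrative, not from the study), a paired t-statistic over per-participant condition means could look like:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(cond_a, cond_b):
    """Paired t-statistic for two within-subject condition means."""
    diffs = [a - b for a, b in zip(cond_a, cond_b)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Synthetic per-participant mean EMG amplitudes (arbitrary units)
concrete = [12.1, 10.4, 15.2, 11.8, 13.0]
abstract = [9.8, 9.9, 13.1, 10.2, 11.4]
t = paired_t(concrete, abstract)
print(round(t, 2))
```

A positive t here would correspond to the reported direction of the effect (larger amplitudes for concrete words); a real analysis would of course also report degrees of freedom and a p-value.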
2.
Studies of great apes have revealed that they use manual gestures and other signals to communicate about distal objects. There is also evidence that chimpanzees modify the types of communicative signals they use depending on the attentional state of a human communicative partner. The majority of previous studies have involved chimpanzees requesting food items from a human experimenter. Here, these same communicative behaviors are reported in chimpanzees requesting a tool from a human observer. In this study, captive chimpanzees were found to gesture, vocalize, and display more often when the experimenter had a tool than when she did not. It was also found that chimpanzees responded differentially based on the attentional state of a human experimenter, and when given the wrong tool persisted in their communicative efforts. Implications for the referential and intentional nature of chimpanzee communicative signaling are discussed.
3.
A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a “frame” (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a “last item” belonging to one of four categories: a high-cloze-probability sign (a “semantically reasonable” completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a “semantically odd” completion to the sentence; e.g. LEMON), a pseudo-sign (a phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity.
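Measuring an N400-like response amounts to averaging EEG epochs time-locked to the last item and taking the mean voltage in a post-stimulus window (classically around 300–500 ms). A toy sketch of that computation, with the sampling rate and data invented purely for illustration:

```python
def mean_amplitude(epochs, srate, t0, t1):
    """Grand-average the epochs, then return mean voltage in [t0, t1) seconds."""
    n = len(epochs)
    grand = [sum(samples) / n for samples in zip(*epochs)]  # grand-average ERP
    i0, i1 = round(t0 * srate), round(t1 * srate)
    window = grand[i0:i1]
    return sum(window) / len(window)

# Three synthetic 1-second epochs sampled at 10 Hz (microvolts);
# a negative-going deflection mid-epoch stands in for an N400
epochs = [
    [0, 0, 0, -2, -4, -6, -4, -2, 0, 0],
    [0, 0, 0, -1, -3, -5, -3, -1, 0, 0],
    [0, 0, 0, -3, -5, -7, -5, -3, 0, 0],
]
n400 = mean_amplitude(epochs, srate=10, t0=0.3, t1=0.5)
```

A more negative mean amplitude in this window for incongruent or pseudo-sign endings, relative to congruent ones, is the signature the study looks for; the large positivity for gestures would show up as the opposite sign.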
4.
In human face-to-face communication, language comprehension is a multi-modal, situated activity. However, little is known about how we combine information from different modalities during comprehension, and how perceived communicative intentions, often signaled through visual signals, influence this process. We explored this question by simulating a multi-party communication context in which a speaker alternated her gaze between two recipients. Participants viewed speech-only or speech + gesture object-related messages when being addressed (direct gaze) or unaddressed (gaze averted to other participant). They were then asked to choose which of two object images matched the speaker’s preceding message. Unaddressed recipients responded significantly more slowly than addressees for speech-only utterances. However, perceiving the same speech accompanied by gestures sped unaddressed recipients up to a level identical to that of addressees. That is, when unaddressed recipients’ speech processing suffers, gestures can enhance the comprehension of a speaker’s message. We discuss our findings with respect to two hypotheses attempting to account for how social eye gaze may modulate multi-modal language comprehension.
5.
The level of motivation (i.e. incentive power) is thought to be one of the most important factors affecting performance and learning in various tasks. We investigated whether reward quality affects the performance of family dogs in a two-way object choice test, in which they can find hidden food by relying on distal momentary human pointing cues. In three experiments we varied (1) the type of food reward according to the subjects’ own preference; (2) the quality of the reward offered at the same time in the indicated and not-indicated locations; and (3) the order of high- or low-quality rewards in consecutive sessions. In Experiment 1, we first tested whether dogs prefer one kind of reward over another. One group was then tested with the ‘preferred’ food as reward in the indicated bowl, while dogs in the other group received the ‘non-preferred’ food as reward. We found no difference between the performance or choice latencies of the two groups. In Experiment 2, for the first group the indicated bowl contained a piece of carrot and the not-indicated bowl was empty; for the second group the indicated bowl contained carrot, but the not-indicated bowl contained sausage. According to a preliminary preference test, most dogs consistently prefer sausage over carrot. After 20 trials, the two groups performed surprisingly similarly: no difference was found between groups in the number of correct choices, incorrect choices, or non-choices. However, a comparison of the first and last five trials revealed that subjects who found sausage when they chose the not-indicated bowl (i.e. did not follow the pointing) chose the not-indicated bowl significantly more often toward the end of their test session. In Experiment 3, each dog received two sessions of 12 pointing trials each. In the first session, one group was rewarded with sausage and the other with carrot upon choosing the indicated bowl. In the second session, the indicated bowl contained dry dog food for both groups. Correct choices and response latencies did not change over the two sessions in the ‘sausage’ group. In the ‘carrot’ group, the dogs chose faster in the second session, but their performance did not improve; in fact, they chose the not-indicated bowl more often than the indicated bowl. In conclusion, reward quality had some effect on dogs’ choice behavior in these experiments, though the drop in performance was not drastic given the dogs’ general refusal to eat one of the ‘rewards’ (carrot) during both the preference tests and the test trials. Incentive contrast thus seems to play a relatively minor role in dog-human social interactions. Appropriate reward quality can be very important in asocial problem-solving tasks, but when interacting with humans, following human signals may override the effect of changed incentive power.
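The Experiment 2 analysis above rests on tallying choice outcomes in the first versus the last five trials of a session. A sketch of that tally (the trial record and outcome codes are made up for illustration):

```python
from collections import Counter

# Illustrative 20-trial record for one dog:
# 'I' = chose the indicated bowl, 'N' = chose the not-indicated bowl
trials = list("IIINI" "INIIN" "NINNI" "NNINN")

first5, last5 = Counter(trials[:5]), Counter(trials[-5:])
# Shift toward the not-indicated bowl across the session
shift = last5["N"] - first5["N"]
```

A positive `shift` for dogs who found sausage in the not-indicated bowl, but not for the other group, is the pattern the abstract describes; the real analysis would aggregate such counts across subjects and test the difference statistically.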
6.
Our preferences are sensitive to social influences. For instance, we like objects that are looked at by others more than objects that are not. Here, we explored this liking effect using a modified paradigm of attention cueing by gaze. First, we investigated whether the liking effect induced by gaze relies on motoric representations of the target object, by testing whether the effect could be observed for non-manipulable items (alphanumeric characters) as well as for manipulable ones (common tools). We found a significant liking effect for the alphanumeric items. Second, we tested whether another powerful social cue could also induce a liking effect, using an equivalent paradigm with pointing hands instead of gaze cues. Pointing hands elicited a robust attention-orienting effect, but they did not induce any significant liking effect. This study extends previous findings and reinforces the view of eye gaze as a special cue in human interactions.
7.
Infant signs are intentionally taught and learned symbolic gestures that can represent objects, actions, requests, and mental states. Through infant signs, parents and infants begin to communicate specific concepts before children produce their first spoken words. This study examines whether cultural differences in language are reflected in children’s and parents’ use of infant signs. Parents speaking East Asian languages with their children use verbs more often than do English-speaking mothers, and compared to their English-learning peers, Chinese children are more likely to learn verbs as they first acquire spoken words. By comparing parents’ and infants’ use of infant signs in the U.S. and Taiwan, we investigate cultural differences in noun/object versus verb/action bias before children’s first language. Parents retrospectively reported their own and their children’s first uses of infant signs. Results show that cultural differences in parents’ and children’s infant sign use were consistent with research on early words, reflecting cultural differences in communicative functions (referential versus regulatory) and child-rearing goals (independence versus interdependence). The current study provides evidence that intergenerational transmission of culture through symbols begins prior to oral language.
8.
Extensive research shows that caregivers’ speech and gestures can scaffold children’s learning. This study examines whether caregivers increase the amount of spoken and gestural instruction when a task becomes difficult for children. We also examine whether increasing the amount of instruction containing both speech and gestures enhances children’s problem-solving. Ninety-three 3- to 4-year-old Chinese children and their caregivers participated in our study. The children tried to assemble two jigsaw puzzles (with 12 pieces in one and 20 in the other); each puzzle was attempted in three phases. The order in which the puzzles were to be solved was randomized. In Phases 1 and 3, the children tried to solve the puzzles alone. In Phase 2, the children received instruction from their caregivers. The children assembled a smaller proportion of the 20-piece puzzle than of the 12-piece one, suggesting that the 20-piece puzzle was more difficult than the 12-piece one. The caregivers produced more spoken and gestural instruction for the 20-piece than for the 12-piece puzzle. The proportion of the instruction employing both speech and gesture (+InstS+InstG) was significantly greater for the 20-piece puzzle than for the 12-piece puzzle. More importantly, the children who received more instruction with +InstS+InstG performed better in solving the 20-piece puzzle than those who received less instruction of the same type. Those who did not receive +InstS+InstG instruction performed less successfully in Phase 3. However, the facilitating effect of instruction with +InstS+InstG was not found with the 12-piece puzzle. Our findings suggest that adults should incorporate speech and gesture in their instruction as frequently as possible when teaching their children to perform a difficult task.
9.
Co-speech gestures are a universal feature of human verbal communication and serve an information-exchange function. According to their purpose and scope of use, gestures can be divided into representational and non-representational gestures. Most researchers hold that language and gesture are “close relatives” with a “family resemblance.” Evidence from language development, cognitive psychology, and cognitive neuroscience all indicates that gesture and speech share a single communication system. When a gesture and a spoken word share the same meaning, the gesture is subject to interference from the word’s articulation, which in turn amplifies the second formant (F2) of the articulation. Gesture and speech follow an interaction account: their interplay is grounded in semantic congruence, and the mirror neuron system carries out the transfer of this semantic congruence between the two. Because the relation between co-speech gesture and language is a prime example of the tight coupling of language and action, research on co-speech gestures will contribute to a deeper understanding of the human mind.
10.
Gesture Reflects Language Development: Evidence From Bilingual Children
There is a growing awareness that language and gesture are deeply intertwined in the spontaneous expression of adults. Although some research suggests that children use gesture independently of speech, there is scant research on how language and gesture develop in children older than 2 years. We report here on a longitudinal investigation of the relation between gesture and language development in French-English bilingual children from 2 to 3 1/2 years old. The specific gesture types of iconics and beats correlated with the development of the children's two languages, whereas pointing types of gestures generally did not. The onset of iconic and beat gestures coincided with the onset of sentencelike utterances separately in each of the children's two languages. The findings show that gesture is related to language development rather than being independent from it. Contrasting theories about how gesture is related to language development are discussed.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号