21.
This study investigated whether gesturing classes (baby sign) affected parental frustration and stress, as advertised by many commercial products. The participants were 178 mother–infant dyads, divided into a gesture group (n=89) and a non‐gesture group (n=89), based on whether they had attended baby sign classes or not. Mothers completed a background demographics questionnaire and the Parenting Stress Index. Gesturing mothers had higher total stress scores, with higher scores on the child domain, despite having similar backgrounds to non‐gesturing mothers. There was no relationship between the frequency or duration of gesture use and stress scores. It is suggested that gesturing mothers had higher pre‐existing stress and were attracted to gesture classes because of the promoted benefits, which include stress reduction, although class attendance did not alleviate their stress. The possibility that attending gesturing classes made mothers view their infant in a more negative way, due to their heightened expectations not being met, is also discussed. Copyright © 2010 John Wiley & Sons, Ltd.
22.
Gesture is an important non-verbal medium in communicative interaction. Gestures not only support verbal communication but can also communicate in their own right; as a non-verbal medium co-occurring with speech, communicative gesture helps reduce the cognitive load of communication. This article summarizes and reviews three accounts: theories of communicative gesture grounded in the relationship between gesture and speech, the activation theory of communicative gesture, and the cognitive-saving theory of communicative gesture. Future research needs to further consider the balance between ecological naturalness and strict experimental control in studies of communicative gesture, the relationship between communicative gesture and other non-verbal factors, and the practical implications of cognitive research on communicative gesture.
23.
殷融. 《心理科学进展》 (Advances in Psychological Science), 2020, 28(7): 1141-1155
Language evolution is a central question in evolutionary psychology. The mirror system hypothesis, the tool-making hypothesis, and the teaching hypothesis each explain, from a different angle, the relationship between manual action and language evolution; all three hold that human language originated in manual-action experience. Empirical research supports the specific claims of these hypotheses: sign language and spoken language share consistent features, language and manual action share common neural substrates, gesture development predicts later language development, and gesture improves the efficiency with which tool-making knowledge is transmitted. Future research in this area should attend to the evolutionary relationship between gestural and spoken language, and to the relationship between the evolution of human language and the evolution of other cognitive traits.
24.
Natural languages make prolific use of conventional constituent‐ordering patterns to indicate "who did what to whom," yet the mechanisms through which these regularities arise are not well understood. A series of recent experiments demonstrates that, when prompted to express meanings through silent gesture, people bypass native language conventions, revealing apparent biases underpinning word order usage, based on the semantic properties of the information to be conveyed. We extend the scope of these studies by focusing, experimentally and computationally, on the interpretation of silent gesture. We show cross‐linguistic experimental evidence that people use variability in constituent order as a cue to obtain different interpretations. To illuminate the computational principles that govern interpretation of non‐conventional communication, we derive a Bayesian model of interpretation via biased inductive inference and estimate these biases from the experimental data. Our analyses suggest that people's interpretations balance the ambiguity characteristic of emerging language systems against ordering preferences that are skewed and asymmetric, but defeasible.
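A minimal toy sketch of the kind of Bayesian interpretation this abstract describes (not the authors' actual model): a skewed but defeasible prior over candidate meanings is combined with the likelihood of the observed constituent order under each meaning. All names and numbers here are illustrative assumptions, not values from the study.

```python
def posterior(prior, likelihood):
    """Normalized posterior P(meaning | observed order) via Bayes' rule."""
    unnorm = {m: prior[m] * likelihood[m] for m in prior}
    z = sum(unnorm.values())
    return {m: v / z for m, v in unnorm.items()}

# Skewed, asymmetric prior over two readings of an ambiguous gesture
# string: interpreters favor an agent-first reading by default
# (hypothetical values for illustration only).
prior = {"agent-first": 0.8, "patient-first": 0.2}

# Likelihood of the observed ordering under each reading (hypothetical).
likelihood = {"agent-first": 0.5, "patient-first": 0.9}

post = posterior(prior, likelihood)
# The prior dominates here, but stronger evidence for the patient-first
# reading would shift the posterior: the bias is defeasible.
```

The design choice to keep the prior and likelihood as separate dictionaries mirrors the abstract's point that ordering biases (the prior) can be estimated independently of the cue provided by a particular constituent order (the likelihood).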
25.
Sighted speakers of different languages vary systematically in how they package and order components of a motion event in speech. These differences influence how semantic elements are organized in gesture, but only when those gestures are produced with speech (co‐speech gesture), not without speech (silent gesture). We ask whether the cross‐linguistic similarity in silent gesture is driven by the visuospatial structure of the event. We compared 40 congenitally blind adult native speakers of English or Turkish (20/language) to 80 sighted adult speakers (40/language; half with, half without blindfolds) as they described three‐dimensional motion scenes. We found an effect of language on co‐speech gesture, not on silent gesture—blind speakers of both languages organized their silent gestures as sighted speakers do. Humans may have a natural semantic organization that they impose on events when conveying them in gesture without language—an organization that relies on neither visuospatial cues nor language structure.
26.
Memory for series of action phrases improves in listeners when speakers accompany each phrase with congruent gestures compared to when speakers stay still. Studies reveal that the listeners' motor system, at encoding, plays a crucial role in this enactment effect. We present two experiments on gesture observation, which explored the role of the listeners' motor system at recall. The participants listened to the phrases uttered by a speaker in two conditions in each experiment. In the gesture condition, the speaker uttered the phrases with accompanying congruent gestures, and in the no-gesture condition, the speaker stayed still while uttering the phrases. The participants were then invited, in both conditions of the experiments, to perform a motor task while recalling the phrases proffered by the speaker. The results revealed that the advantage of observing gestures on memory disappears if the listeners move their arms and hands at recall (the same motor effectors moved by the speaker; Experiment 1a), but not when the listeners move their legs and feet (motor effectors different from those moved by the speaker; Experiment 1b). The results suggest that the listeners' motor system is involved not only during the encoding of action phrases uttered by a speaker but also when recalling these phrases during retrieval.
27.
Twenty-two typically developing toddlers (M = 24.32 months) and their mothers were observed in a playroom for 30 min while solving puzzles. The target of the observations was the hand-taking gesture, which researchers have considered rare among typically developing children and more frequent among autistic children. Ten of the 22 children showed this gesture within only 30 min. These children appear to understand "I cannot do it by myself, but my mother can." If we can assume that children grasp others' mental mechanisms in this way, the hand-taking gesture may mark one of the origins of a theory of mind.
28.
The aim of the present study was to examine the comprehension of gesture in a situation in which the communicator cannot (or can only with difficulty) use verbal communication. Based on theoretical considerations, we expected to obtain higher semantic comprehension for emblems (gestures with a direct verbal definition or translation that is well known by all members of a group, or culture) compared to illustrators (gestures regarded as spontaneous and idiosyncratic and that do not have a conventional definition). Based on the extant literature, we predicted higher semantic specificity associated with arbitrarily coded and iconically coded emblems compared to intrinsically coded illustrators. Using a scenario of emergency evacuation, we tested the difference in semantic specificity between different categories of gestures. A total of 138 participants each saw 10 videos, each illustrating a gesture performed by a firefighter. They were requested to imagine themselves in a dangerous situation and to report the meaning associated with each gesture. The results showed that intrinsically coded illustrators were more successfully understood than arbitrarily coded emblems, probably because the meaning of intrinsically coded illustrators is immediately comprehensible without recourse to symbolic interpretation. Furthermore, there was no significant difference between the comprehension of iconically coded emblems and that of either arbitrarily coded emblems or intrinsically coded illustrators. It seems that the difference between the latter two types of gestures was supported by their difference in semantic specificity, although in a direction opposite to that predicted. These results are in line with those of Hadar and Pinchas‐Zamir (2004), which showed that iconic gestures have higher semantic specificity than conventional gestures.
30.
Both vocalization and gesture are universal modes of communication and fundamental features of language development. The gestural origins theory proposes that language evolved out of early gestural use. However, evidence reported here suggests vocalization is much more prominent in early human communication than gesture is. To our knowledge no prior research has investigated the rates of emergence of both gesture and vocalization across the first year in human infants. We evaluated the rates of gestures and speech-like vocalizations (protophones) in 10 infants at 4, 7, and 11 months of age using parent-infant laboratory recordings. We found that infant protophones outnumbered gestures substantially at all three ages, ranging from >35 times more protophones than gestures at 4 months, to >2.5 times more protophones than gestures at 11 months. The results suggest vocalization, not gesture, is the predominant mode of communication in human infants in the first year.
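A hedged illustration of the rate comparison this abstract reports: the session counts below are invented for demonstration and are not the study's data; only the direction of the comparison mirrors the abstract.

```python
def rate_ratio(protophones, gestures):
    """How many speech-like vocalizations occur per gesture in a session."""
    return protophones / gestures

# Hypothetical per-session counts (protophones, gestures) at three ages;
# these numbers are illustrative assumptions, not the study's findings.
sessions = {"4mo": (180, 5), "7mo": (150, 12), "11mo": (100, 38)}
ratios = {age: rate_ratio(p, g) for age, (p, g) in sessions.items()}

# With these toy counts the ratio shrinks with age as gesturing emerges,
# yet protophones still outnumber gestures at every age.
```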