Similar Articles
20 similar articles found (search time: 31 ms)
1.
A group of individuals conversing in natural dyads and a group of lecturers were observed for lateral hand movement patterns during speech. Right-handed individuals in both groups displayed a significant right hand bias for gesture movements but no lateral bias for self-touching movements. The study provides external validity for previous laboratory studies of lateralized hand gesture. The results were interpreted as evidence of a central processor for both spoken and gestural communication.

2.
Speakers convey meaning not only through words, but also through gestures. Although children are exposed to co-speech gestures from birth, we do not know how the developing brain comes to connect meaning conveyed in gesture with speech. We used functional magnetic resonance imaging (fMRI) to address this question and scanned 8- to 11-year-old children and adults listening to stories accompanied by hand movements, either meaningful co-speech gestures or meaningless self-adaptors. When listening to stories accompanied by both types of hand movement, both children and adults recruited inferior frontal, inferior parietal, and posterior temporal brain regions known to be involved in processing language not accompanied by hand movements. There were, however, age-related differences in activity in posterior superior temporal sulcus (STSp), inferior frontal gyrus, pars triangularis (IFGTr), and posterior middle temporal gyrus (MTGp) regions previously implicated in processing gesture. Both children and adults showed sensitivity to the meaning of hand movements in IFGTr and MTGp, but in different ways. Finally, we found that hand movement meaning modulates interactions between STSp and other posterior temporal and inferior parietal regions for adults, but not for children. These results shed light on the developing neural substrate for understanding meaning contributed by co-speech gesture.

3.
The gestures that accompany speech are more than arbitrary hand movements or communicative devices. They are simulated actions that can both prime and facilitate speech and cognition. This study measured participants' reaction times for naming degraded images of objects while adopting a gesture congruent with the target object, adopting one incongruent with it, or making no hand gesture at all. A within-subjects design was used, with participants (N = 122) naming 10 objects under each condition. Participants named the objects significantly faster when adopting a congruent gesture than when not gesturing at all; adopting an incongruent gesture resulted in significantly slower naming times. The findings are discussed in the context of the intrapersonal cognitive and facilitatory effects of gestures and underline the relatedness of language, action, and cognition.

4.
Intentional and attentional dynamics of speech-hand coordination
Interest is rapidly growing in the hypothesis that natural language emerged from a more primitive set of linguistic acts based primarily on manual activity and hand gestures. Increasingly, researchers are investigating how hemispheric asymmetries are related to attentional and manual asymmetries (i.e., handedness). Both speech perception and production have origins in the dynamical generative movements of the vocal tract known as articulatory gestures. Thus, the notion of a "gesture" can be extended to both hand movements and speech articulation. The generative actions of the hands and vocal tract can therefore provide a basis for the (direct) perception of linguistic acts. Such gestures are best described using the methods of dynamical systems analysis since both perception and production can be described using the same commensurate language. Experiments were conducted using a phase transition paradigm to examine the coordination of speech-hand gestures in both left- and right-handed individuals. Results address coordination (in-phase vs. anti-phase), hand (left vs. right), lateralization (left vs. right hemisphere), focus of attention (speech vs. tapping), and how dynamical constraints provide a foundation for human communicative acts. Predictions from the asymmetric HKB equation confirm the attentional basis of functional asymmetry. Of significance is a new understanding of the role of perceived synchrony (p-centres) during intentional cases of gestural coordination.
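For context on the model invoked in this abstract: the HKB equation describes the dynamics of the relative phase φ between two rhythmically coordinated effectors, here speech articulation and finger tapping. The form below is the standard symmetry-broken ("asymmetric") HKB equation from the coordination-dynamics literature; the study's exact parameterization may differ:

```latex
\dot{\phi} = \delta\omega - a\sin\phi - 2b\sin 2\phi + \sqrt{Q}\,\xi_t
```

Here δω is the detuning term capturing the asymmetry between the two components, a and b set the relative stability of in-phase (φ = 0) and anti-phase (φ = π) coordination, and √Q ξ_t is a noise term. As movement frequency increases, the ratio b/a shrinks and anti-phase coordination loses stability, producing the phase transitions exploited by this paradigm.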

5.
Motor movements increase the accessibility of the thought content and processes with which they typically co-occur. In two studies, we demonstrate that putting a hand on one’s heart is associated with honesty, both perceived in others and shown in one’s own behavior. Target persons photographed when performing this gesture appeared more trustworthy than the same targets photographed with both hands down (Study 1). Participants who put their hand on their hearts were more willing to admit their lack of knowledge (Study 2), compared to when they performed a neutral gesture. These findings replicate and extend the notion that bodily experience related to abstract concepts of honesty can influence both perceptions of others, and one’s own actions.

6.
Gestural communication is considered in the literature as a candidate modality for determining the ancestral prerequisites of the emergence of human language. As reported in captive chimpanzees and human children, a study in captive baboons revealed that a communicative gesture elicits a stronger degree of right-hand bias than non-communicative actions. It remains unclear, however, whether it is the communicative nature of this manual behavior that induces such patterns of handedness. In the present study, we measured hand use for two previously uninvestigated behaviors in a group of captive olive baboons: (1) a non-communicative self-touching behavior (the “muzzle wipe”, serving as a control behavior), and (2) a communicative gesture (a ritualized “food beg”) different from the one previously studied in the same population (a species-specific threat gesture, the “hand slap”). Hand preferences for the “food beg” gesture showed a trend toward right-handedness and correlated significantly with the hand preferences previously reported for the hand slap gesture in the same baboons. By contrast, hand preferences for the self-touching behavior showed neither a group-level manual bias nor any correlation with the hand preferences for the communicative gestures. These findings provide additional support for the hypothesized existence in baboons of a specific communicative system involved in the production of communicative gestures, one that may tend toward left-hemispheric dominance and that may differ from the system involved in purely motor functions. The implications of these collective results are discussed within the theoretical framework concerning the origins of hemispheric specialization for human language.

7.
Chen EE, Small SL. Brain and Language, 2007, 102(2): 176-185
This paper explores how test-retest reliability is modulated by different groups of participants and experimental tasks. A group of 12 healthy participants and a group of nine stroke patients performed the same language imaging experiment twice, test and retest, on different days. The experiment consisted of four conditions: one audio condition and three audiovisual conditions in which the hands were either resting, gesturing, or performing self-adaptive movements. Imaging data were analyzed using multiple linear regression, and the results were further used to generate receiver operating characteristic (ROC) curves for each condition for each individual subject. Using the area under the curve (AUC) as a comparison index, we found that stroke patients show lower reliability across time than healthy participants, and that when participants gesture during speech, their imaging data are more reliable than when they perform hand movements that are not speech-associated. Furthermore, inter-subject variability was lower in the gesture task than in any of the other three conditions for healthy participants, but not for stroke patients.
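As an aside on the comparison index used here: the area under an ROC curve equals the probability that a randomly chosen truly positive item receives a higher detection statistic than a randomly chosen negative one (the Mann-Whitney rank identity). The sketch below illustrates that computation on toy data; the function name, labels, and scores are hypothetical and not taken from the study:

```python
import numpy as np

def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U) identity.

    labels: 1 for items that should be detected, 0 otherwise
    scores: a continuous detection statistic (e.g. regression t-values)
    """
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # fraction of (positive, negative) pairs ranked correctly,
    # counting ties as half-correct
    correct = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (correct + 0.5 * ties) / (len(pos) * len(neg))

# toy example: the statistic separates classes perfectly -> AUC = 1.0
print(roc_auc([1, 1, 0, 0], [3.0, 2.0, 1.0, 0.5]))
```

An AUC of 1.0 means perfect separation and 0.5 means chance, which is why a lower AUC across sessions indicates poorer test-retest reliability.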

8.
9.
Yin Rong. Advances in Psychological Science (心理科学进展), 2020, 28(7): 1141-1155
Language evolution is an important question in evolutionary psychology. The mirror system hypothesis, the tool-making hypothesis, and the teaching hypothesis each explain the relationship between hand movements and language evolution from a different angle, and all three hold that human language originated in the experience of manual action. Empirical studies have found that sign language and spoken language share consistent features, that language and hand movements share a common neural basis, that gesture development predicts the level of language development, and that gestures improve the efficiency with which tool-making knowledge is transmitted; these findings provide empirical support for the specific claims of the three hypotheses. Future research in this area should address the developmental relationship between gesture and spoken language over the course of evolution, as well as the relationship between the evolution of human language and that of other cognitive traits.

10.
Recent research suggests that an attentional bias toward threat may play a causal role in obsessive-compulsive disorder (OCD) with contamination concerns. However, the attentional components involved in this bias, as well as its behavioral correlates, remain unclear. In the present study, eye movements were recorded in individuals high and low in contamination fear (HCF, LCF, respectively) during 30-s exposures to stimulus arrays containing contamination threat, general threat, pleasant, and neutral images. HCF individuals oriented gaze toward contamination threat more often than LCF individuals in initial fixations, and this bias mediated group differences in responding to a behavioral challenge in a public restroom. No group differences were found in the maintenance of gaze on contamination threat, both in terms of initial gaze encounters, as well as gaze duration over time. However, the HCF group made shorter fixations on contamination threat relative to other image types. The implications of these findings for further delineating the nature and function of attentional biases in contamination-based OCD are discussed.

11.
Tsai JC, Sebanz N, Knoblich G. Cognition, 2011, 118(1): 135-140
Research on perception–action links has focused on an interpersonal level, demonstrating effects of observing individual actions on performance. The present study investigated perception–action matching at an inter-group level. Pairs of participants responded to hand movements that were performed by two individuals who used one hand each or they responded to hand movements performed by an individual who used both hands. Apart from the difference in the number of observed agents, the observed hand movements were identical. If co-actors form action plans that specify the actions to be performed jointly, then participants should have a stronger tendency to mimic group actions than individual actions. Confirming this prediction, the results showed larger mimicry effects when groups responded to group actions than when groups responded to otherwise identical individual actions. This suggests that representations of joint tasks modulate automatic perception–action links and facilitate mimicry at an inter-group level.

12.
To assess the extent of interaction between lateral biases in response systems at different levels of the neuraxis, and to detect possible patterns of interaction specific to population subgroups, we investigated laterality in hand reaching, whole-body turning, and eye use in 20 bushbabies (Otolemur garnettii). Two subgroups were clearly identified: the STABLE group comprised subjects, mainly females, that were consistent in hand preference and showed a correlation between hand and eye bias; the UNSTABLE group comprised subjects, mainly males, whose hand preference shifted with test conditions and who showed a correlation between hand and turning bias. The results are interpreted as supporting the study of interactions between lateral biases as a way of gaining a deeper understanding of the complexity of laterality.

13.
The movements that we make with our body vary continuously along multiple dimensions. However, many of the tools and techniques presently used for coding and analyzing hand gestures and other body movements yield categorical outcome variables. Focusing on categorical variables as the primary quantitative outcomes may mislead researchers or distort conclusions. Moreover, categorical systems may fail to capture the richness present in movement. Variations in body movement may be informative in multiple dimensions. For example, a single hand gesture has a unique size, height of production, trajectory, speed, and handshape. Slight variations in any of these features may alter how both the speaker and the listener are affected by gesture. In this paper, we describe a new method for measuring and visualizing the physical trajectory of movement using video. This method is generally accessible, requiring only video data and freely available computer software. This method allows researchers to examine features of hand gestures, body movement, and other motion, including size, height, curvature, and speed. We offer a detailed account of how to implement this approach, and we also offer some guidelines for situations where this approach may be fruitful in revealing how the body expresses information. Finally, we provide data from a small study on how speakers alter their hand gestures in response to different characteristics of a stimulus to demonstrate the utility of analyzing continuous dimensions of motion. By creating shared methods, we hope to facilitate communication between researchers from varying methodological traditions.
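To make the idea of continuous movement dimensions concrete, here is a minimal sketch of how size, height, and speed can be derived from a tracked (x, y) hand trajectory. The function name, the feature set, and the toy coordinates are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def trajectory_features(x, y, fps=30.0):
    """Continuous descriptors of a tracked hand path.

    x, y: per-frame pixel coordinates of the hand (e.g. from video tracking)
    fps:  video frame rate, used to convert per-frame motion to speed
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    dx, dy = np.diff(x), np.diff(y)
    step = np.hypot(dx, dy)                 # per-frame displacement (pixels)
    return {
        "path_length": step.sum(),          # total distance travelled
        "size": np.ptp(x) * np.ptp(y),      # bounding-box area of the gesture
        "mean_height": y.mean(),            # image y (smaller = higher on screen)
        "mean_speed": step.mean() * fps,    # pixels per second
    }

# toy three-frame trajectory
feats = trajectory_features([0, 3, 3], [0, 4, 8], fps=30.0)
```

Each feature is a continuous outcome variable, so gestures can be compared along these dimensions directly rather than being binned into categories.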

14.
D. M. Jacobs and C. F. Michaels (2006) concluded that aspects of hand movements in lateral catching were predicted by the ratio of lateral optical velocity to expansion velocity. Their conclusions were based partly on a modified version of the required velocity model of catching (C. E. Peper, R. J. Bootsma, D. R. Mestre, & F. C. Bakker, 1994). The present article considers this optical ratio in detail and asks whether it, together with a control law, predicts the (often curious) hand trajectories observed in lateral interception. The optical ratio was used to create a succession of target-position inputs for the vector integration to endpoint model of hand movements (D. Bullock & S. Grossberg, 1988). The model used this succession, initial hand position, and model parameters (fit to 60 trials) to predict hand trajectories on each trial. Predicted trajectories were then compared with observed hand trajectories. Hand movements were predicted accurately, especially in the binocular condition, and were superior to predictions based on lateral ball position, the input variable of the required velocity model. The authors concluded, as did C. E. Peper et al. (1994), that perceivers continuously couple movements to optics.

15.
Co-thought gestures are hand movements produced in silent, noncommunicative, problem-solving situations. In the study, we investigated whether and how such gestures enhance performance in spatial visualization tasks such as a mental rotation task and a paper folding task. We found that participants gestured more often when they had difficulties solving mental rotation problems (Experiment 1). The gesture-encouraged group solved more mental rotation problems correctly than did the gesture-allowed and gesture-prohibited groups (Experiment 2). Gestures produced by the gesture-encouraged group enhanced performance in the very trials in which they were produced (Experiments 2 & 3). Furthermore, gesture frequency decreased as the participants in the gesture-encouraged group solved more problems (Experiments 2 & 3). In addition, the advantage of the gesture-encouraged group persisted into subsequent spatial visualization problems in which gesturing was prohibited: another mental rotation block (Experiment 2) and a newly introduced paper folding task (Experiment 3). The results indicate that when people have difficulty in solving spatial visualization problems, they spontaneously produce gestures to help them, and gestures can indeed improve performance. As they solve more problems, the spatial computation supported by gestures becomes internalized, and the gesture frequency decreases. The benefit of gestures persists even in subsequent spatial visualization problems in which gesture is prohibited. Moreover, the beneficial effect of gesturing can be generalized to a different spatial visualization task when two tasks require similar spatial transformation processes. We concluded that gestures enhance performance on spatial visualization tasks by improving the internal computation of spatial transformations.

16.
A beneficial effect of gesture on learning has been demonstrated in multiple domains, including mathematics, science, and foreign language vocabulary. However, because gesture is known to co‐vary with other non‐verbal behaviors, including eye gaze and prosody along with face, lip, and body movements, it is possible the beneficial effect of gesture is instead attributable to these other behaviors. We used a computer‐generated animated pedagogical agent to control both verbal and non‐verbal behavior. Children viewed lessons on mathematical equivalence in which an avatar either gestured or did not gesture, while eye gaze, head position, and lip movements remained identical across gesture conditions. Children who observed the gesturing avatar learned more, and they solved problems more quickly. Moreover, those children who learned were more likely to transfer and generalize their knowledge. These findings provide converging evidence that gesture facilitates math learning, and they reveal the potential for using technology to study non‐verbal behavior in controlled experiments.

17.
When people talk, they gesture. We show that gesture introduces action information into speakers' mental representations, which, in turn, affect subsequent performance. In Experiment 1, participants solved the Tower of Hanoi task (TOH1), explained (with gesture) how they solved it, and solved it again (TOH2). For all participants, the smallest disk in TOH1 was the lightest and could be lifted with one hand. For some participants (no-switch group), the disks in TOH2 were identical to those in TOH1. For others (switch group), the disk weights in TOH2 were reversed (so that the smallest disk was the heaviest and could not be lifted with one hand). The more the switch group's gestures depicted moving the smallest disk one-handed, the worse they performed on TOH2. This was not true for the no-switch group, nor for the switch group in Experiment 2, who skipped the explanation step and did not gesture. Gesturing grounds people's mental representations in action. When gestures are no longer compatible with the action constraints of a task, problem solving suffers.

18.
Gestures are pervasive in human communication across cultures; they clearly constitute an embodied aspect of cognition. In this study, evidence is provided for the contention that gestures are not only a co-expression of meaning in a different modality but also constitute an important stepping stone in the evolution of discourse. Data are provided from a Grade 10 physics course where students learned about electrostatics by doing investigations for which they constructed explanations. The data show that iconic gestures (i.e. symbolic hand movements) arise from the manipulation of objects (ergotic hand movements) and sensing activity (epistemic hand movements). Gestures not only precede but also support the emergence of scientific language. School science classes turn out to be ideal laboratories for studying the evolution of domain ontologies and (scientific) language. Micro-analytic studies of gesture–speech relationships and their emergence can therefore serve as positive constraints and test beds for synthetic models of language emergence.

19.
Non-communicative hand gestures have been found to benefit problem-solving performance. These gestures seem to compensate for limited internal cognitive capacities, such as visual working memory capacity. Yet, it is not clear how gestures might perform this cognitive function. One hypothesis is that gesturing is a means to spatially index mental simulations, thereby reducing the need for visually projecting the mental simulation onto the visual presentation of the task. If that hypothesis is correct, fewer eye movements should be made when participants gesture during problem solving than when they do not gesture. We therefore used mobile eye tracking to investigate the effect of co-thought gesturing and visual working memory capacity on eye movements during mental solving of the Tower of Hanoi problem. Results revealed that gesturing indeed reduced the number of eye movements (lower saccade counts), especially for participants with a relatively lower visual working memory capacity. Subsequent problem-solving performance was not affected by having (not) gestured during the mental solving phase. The current findings suggest that our understanding of gestures in problem solving could be improved by taking into account eye movements during gesturing.

20.
The authors investigated whether the movement-planning and feedback-processing abilities associated with the two hand-hemisphere systems mediate illusion-induced biases in manual aiming and saccadic eye movements. Although participants' (N = 23) eye movements were biased in the direction expected on the basis of a typical Müller-Lyer configuration, hand movements were unaffected. Most interestingly, both left- and right-handers' eye fixation onset and time to peak hand velocity were earlier when they aimed with the left hand than when they aimed with the right hand, regardless of the availability of vision for online movement control. They thus adapted their eye-hand coordination pattern to accommodate functional asymmetries. The authors suggest that individuals apply different movement strategies according to the abilities of the hand and hemisphere system used to produce the same outcome.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号