Semantics boosts syntax in artificial grammar learning tasks with recursion
Authors: Fedor Anna, Varga Máté, Szathmáry Eörs
Institution: Department of Plant Systematics, Ecology and Theoretical Biology, Eötvös Loránd University of Sciences, Budapest, Hungary. fedoranna@gmail.com
Abstract: Center-embedded recursion (CER) in natural language is exemplified by sentences such as "The malt that the rat ate lay in the house." Parsing center-embedded structures has attracted considerable attention because it may be one of the cognitive capacities that distinguish humans from all other animals. The ability to parse CER is usually tested by means of artificial grammar learning (AGL) tasks, in which participants have to infer the rule underlying a set of artificial sentences. One surprising result of previous AGL experiments is that CER is harder to learn than had been thought. We hypothesized that because artificial sentences lack semantic content, semantics could help humans learn the syntax of center-embedded sentences. To test this, we composed sentences from four vocabularies that differed in degree of semantic content along three factors: familiarity, the meaning of the words, and the semantic relationship between the words. According to our results, none of these factors had an effect on its own, but combined they made learning significantly faster. This suggests that different mechanisms are at work when CER is parsed in natural languages and in artificial languages, and it calls into question the suitability of AGL tasks with artificial vocabularies for studying the learning and processing of linguistic CER.
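For illustration, center-embedded strings of the A1 A2 ... An Bn ... B2 B1 form typically used in such AGL experiments can be generated mechanically. The following minimal Python sketch shows one way to do this; the vocabulary and function name are hypothetical illustrations, not the authors' actual stimuli.

    import random

    def generate_cer_sentence(pairs, depth):
        # Build a center-embedded string A1 A2 ... An Bn ... B2 B1
        # from a list of dependent (A, B) word pairs.
        chosen = random.sample(pairs, depth)
        first_half = [a for a, _ in chosen]
        second_half = [b for _, b in reversed(chosen)]
        return " ".join(first_half + second_half)

    # Hypothetical artificial vocabulary of dependent word pairs
    pairs = [("bi", "du"), ("le", "to"), ("wi", "fe"), ("mo", "ka")]
    print(generate_cer_sentence(pairs, 2))  # e.g. "le wi fe to"

Each outer word must be matched with its inner dependent in mirror order, which is what makes center-embedding hard to parse when, as in artificial vocabularies, no semantic cues link the dependent pairs.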
Indexed in: PubMed and other databases.