Discovering syntactic deep structure via Bayesian statistics
Authors: Jason Eisner
Abstract: In the Bayesian framework, a language learner should seek a grammar that explains the observed data well and is also a priori probable. This paper proposes such a measure of prior probability. Indeed, it develops a full statistical framework for lexicalized syntax. The learner's job is to discover the system of probabilistic transformations (often called lexical redundancy rules) that underlies the patterns of regular and irregular syntactic constructions listed in the lexicon. Specifically, the learner discovers which transformations apply in the language, how often they apply, and in what contexts. It considers simpler systems of transformations to be more probable a priori. Experiments show that the learned transformations are more effective than previous statistical models at predicting the probabilities of lexical entries, especially those for which the learner had no direct evidence.
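As a schematic sketch (this notation is not taken from the paper itself), the Bayesian objective described in the abstract can be written as follows, where $D$ denotes the observed lexical data and $G$ ranges over candidate systems of probabilistic transformations:

$$\hat{G} \;=\; \arg\max_{G}\, P(G \mid D) \;=\; \arg\max_{G}\, P(D \mid G)\, P(G)$$

Here $P(D \mid G)$ measures how well a transformation system explains the lexical entries actually observed, while the prior $P(G)$ assigns higher probability to simpler systems, e.g. schematically $P(G) \propto 2^{-\mathrm{size}(G)}$ under a description-length view. The paper's actual prior over transformation systems is its own proposal and is not reproduced here.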
Keywords: Grammar induction; Bayesian learning; Transformational grammar; Lexicalized syntax
This article is indexed in ScienceDirect and other databases.