

Learning Orthographic Structure With Sequential Generative Neural Networks
Authors: Alberto Testolin, Ivilin Stoianov, Alessandro Sperduti, Marco Zorzi
Affiliation: 1. Department of Developmental Psychology and Socialisation, University of Padova; 2. Department of General Psychology, University of Padova; 3. Cognitive Psychology Laboratory, CNRS & Aix-Marseille University; 4. Department of Mathematics, University of Padova; 5. Center for Cognitive Neuroscience, University of Padova; 6. IRCCS San Camillo Neurorehabilitation Hospital
Abstract: Learning the structure of event sequences is a ubiquitous problem in cognition, and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in connectionist modeling. Here, we investigated a sequential version of the restricted Boltzmann machine (RBM), a stochastic recurrent neural network that extracts high-order structure from sensory data through unsupervised generative learning and can encode contextual information in the form of internal, distributed representations. We assessed whether this type of network can extract the orthographic structure of English monosyllables by learning a generative model of the letter sequences that form a training corpus of words. We show that the network learned an accurate probabilistic model of English graphotactics, which can be used to make predictions about the letter following a given context as well as to autonomously generate high-quality pseudowords. The model was compared to an extended version of the simple recurrent network, augmented with a stochastic process that allows autonomous generation of sequences, and to non-connectionist probabilistic models (n-grams and hidden Markov models). We conclude that sequential RBMs and stochastic simple recurrent networks are promising candidates for modeling cognition in the temporal domain.
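The abstract cites n-gram models among the non-connectionist baselines. To make concrete what a probabilistic model of graphotactics does, namely predict the letter that follows a given context and autonomously generate pseudowords by sampling, the sketch below implements a minimal character-level bigram model in Python. The toy corpus, function names, and the max_len cutoff are illustrative assumptions of ours; this is not the paper's implementation nor its training corpus.

```python
import random
from collections import defaultdict

# Toy stand-in for a corpus of English monosyllables (the paper uses a
# much larger word corpus; this list is purely illustrative).
corpus = ["cat", "bat", "rat", "hat", "mat", "cast", "fast", "last",
          "mist", "list", "fist", "rest", "test", "best", "nest"]

START, END = "^", "$"  # markers for word onset and offset

def train_bigram(words):
    """Estimate P(next letter | previous letter) by counting transitions."""
    counts = defaultdict(lambda: defaultdict(int))
    for w in words:
        seq = START + w + END
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # Normalize raw counts into conditional probability tables.
    return {ctx: {ch: c / sum(nxt.values()) for ch, c in nxt.items()}
            for ctx, nxt in counts.items()}

def sample_pseudoword(model, max_len=10):
    """Generate a letter sequence by ancestral sampling from the model."""
    ctx, out = START, []
    while len(out) < max_len:  # truncate runaway sequences
        letters, probs = zip(*model[ctx].items())
        ch = random.choices(letters, weights=probs)[0]
        if ch == END:
            break
        out.append(ch)
        ctx = ch
    return "".join(out)

model = train_bigram(corpus)
print([sample_pseudoword(model) for _ in range(5)])  # e.g. ['mast', 'fist', ...]
```

The sequential RBM studied in the paper plays the same functional role as this generator, but replaces the explicit transition table with distributed hidden representations learned through unsupervised generative training, allowing it to capture higher-order dependencies than a fixed-order n-gram.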
Keywords: Connectionist modeling; Recurrent neural networks; Restricted Boltzmann machines; Probabilistic graphical models; Generative models; Unsupervised learning; Statistical sequence learning; Orthographic structure