Learning Continuous Probability Distributions with Symmetric Diffusion Networks
Authors:Javier R. Movellan  James L. McClelland
Abstract:In this article we present symmetric diffusion networks, a family of networks that instantiate the principles of continuous, stochastic, adaptive, and interactive propagation of information. Using methods of Markovian diffusion theory, we formalize the activation dynamics of these networks and then show that they can be trained to reproduce entire multivariate probability distributions on their outputs using the contrastive Hebbian learning rule (CHL). We show that CHL performs gradient descent on an error function that captures differences between desired and obtained continuous multivariate probability distributions. This allows the learning algorithm to go beyond expected values of output units and to approximate complete probability distributions on continuous multivariate activation spaces. We argue that learning continuous distributions is an important task underlying a variety of real-life situations that were beyond the scope of previous connectionist networks. Deterministic networks, such as those trained with backpropagation, cannot learn this task because they are limited to learning average values of independent output units. Previous stochastic connectionist networks could learn probability distributions, but they were limited to discrete variables. Simulations show that symmetric diffusion networks can be trained with the CHL rule to approximate discrete and continuous probability distributions of various types.
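The two-phase structure of the CHL rule described above can be illustrated with a minimal sketch: co-activation statistics are collected once with output units clamped to target values and once with the network running freely, and the weight update is the difference of the two. The Langevin-style discretization, the `tanh` drift, and all function names and parameters below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_states(W, clamp=None, n_steps=200, n_samples=100, dt=0.05, sigma=0.3):
    """Draw samples from a stochastic symmetric network via a discretized
    diffusion (illustrative dynamics, not the paper's exact equations).
    `clamp` maps unit indices to values held fixed during sampling."""
    n = W.shape[0]
    samples = np.zeros((n_samples, n))
    for s in range(n_samples):
        x = 0.1 * rng.standard_normal(n)
        for _ in range(n_steps):
            drift = np.tanh(W @ x) - x  # bounded drift toward the net input
            x = x + dt * drift + sigma * np.sqrt(dt) * rng.standard_normal(n)
            if clamp is not None:
                for i, v in clamp.items():
                    x[i] = v            # hold clamped units at target values
        samples[s] = x
    return samples

def chl_update(W, clamp, eta=0.01):
    """One contrastive Hebbian step: clamped minus free co-activations."""
    plus = sample_states(W, clamp=clamp)   # "plus" (clamped) phase
    minus = sample_states(W, clamp=None)   # "minus" (free-running) phase
    dW = eta * (plus.T @ plus / len(plus) - minus.T @ minus / len(minus))
    np.fill_diagonal(dW, 0.0)              # no self-connections
    return W + (dW + dW.T) / 2             # keep the weight matrix symmetric
```

Repeating `chl_update` drives the free-running equilibrium distribution toward the clamped one; symmetry of `W` is preserved at every step, as the network family requires.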