Thathachar, MAL and Arvind, MT (1999) Global Boltzmann Perceptron Network for On-line Learning of Conditional Distributions. In: IEEE Transactions on Neural Networks, 10 (5). pp. 1090-1098.
This paper proposes a backpropagation-based feedforward neural network for learning probability distributions of outputs conditioned on inputs, using only incoming input-output samples. The backpropagation procedure is shown to locally minimize the Kullback-Leibler measure in an expected sense. The procedure is enhanced to facilitate boundedness of weights and exploration of the search space to reach a global minimum. Weak convergence theory is employed to show that the long-term behavior of the resulting algorithm can be approximated by that of a stochastic differential equation, whose invariant distributions are concentrated around the global minima of the Kullback-Leibler measure within a region of interest. Simulation studies on problems involving samples arriving from a mixture of labeled densities and on the well-known Iris data problem demonstrate the speed and accuracy of the proposed procedure.
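The abstract's ingredients (on-line minimization of the Kullback-Leibler measure from input-output samples, noise injection to escape local minima, and bounded weights) can be illustrated with a minimal sketch. This is not the paper's Global Boltzmann Perceptron Network: it uses a simple softmax model, a Langevin-type perturbed gradient step, and clipping as a stand-in for the paper's boundedness device; the step size, noise schedule, and bound are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy data: x in R^2 drawn from one of two labeled Gaussians
# (a mixture of labeled densities, as in the paper's simulation study).
def sample():
    y = rng.integers(2)
    x = rng.normal([0.0, 0.0] if y == 0 else [2.0, 2.0], 1.0)
    return x, y

W = np.zeros((2, 2))         # stand-in model: linear logits
b = np.zeros(2)
eta, bound = 0.1, 10.0       # assumed step size and weight bound

for t in range(5000):
    x, y = sample()
    p = softmax(W @ x + b)
    # Per-sample gradient of the cross-entropy term, whose expectation
    # matches the KL measure up to a constant.
    g = p.copy()
    g[y] -= 1.0
    # Perturbed descent step; decaying Gaussian noise aids exploration
    # of the search space (Langevin-type scheme, assumed here).
    sigma = 0.05 / np.sqrt(1.0 + t)
    W -= eta * np.outer(g, x) + sigma * rng.normal(size=W.shape)
    b -= eta * g + sigma * rng.normal(size=b.shape)
    # Projection keeps the weights bounded, mirroring the boundedness
    # enhancement described in the abstract.
    W = np.clip(W, -bound, bound)
    b = np.clip(b, -bound, bound)
```

After training, the learned conditional distribution should assign high probability to the correct label near each component mean; the actual paper's network and convergence analysis are, of course, considerably more general.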
|Item Type:||Journal Article|
|Additional Information:||Copyright 1999 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.|
|Keywords:||Boltzmann perceptron network;Conditional distributions;Global convergence;Kullback-Leibler measure;Mixture densities;On-line learning|
|Department/Centre:||Division of Electrical Sciences > Electrical Engineering|
|Date Deposited:||06 Mar 2006|
|Last Modified:||19 Sep 2010 04:15|