
Research On Bayesian Networks And Its Learning Algorithms

Posted on: 2003-05-21
Degree: Doctor
Type: Dissertation
Country: China
Candidate: B Yue
Full Text: PDF
GTID: 1118360095951189
Subject: Circuits and Systems
Abstract/Summary:
By exploiting the conditional independences among a set of random variables, a Bayesian network reduces the number of parameters needed to encode their joint probability distribution. At the same time, the graphical representation of conditional independences is intuitive, and efficient algorithms for probabilistic inference in Bayesian networks can be derived by transforming the graph. This dissertation studies several aspects of Bayesian networks; the work is summarized as follows:

1. Because exact probabilistic inference in Bayesian networks is NP-hard, approximate methods are necessary in practice. Based on model simplification, we present an optimal approximation of Bayesian networks by arc removal. First, the optimal approximating network is derived and the invariance of the marginal distributions of certain variables is studied. Then an algorithm is proposed that computes the error incurred by removing multiple arcs and, given a specified error bound, searches for the arcs to remove.

2. A Markov chain Monte Carlo (MCMC) method for learning Bayesian networks is presented. MCMC can be used to explore the posterior distribution produced by Bayesian learning. We use MCMC as a stochastic search strategy because it moves toward models with large posterior probability. The good performance of the method is illustrated by learning the Alarm network.

3. The mixture of factor analyzers is a Bayesian network with a special structure. A two-stage algorithm for learning its parameters is presented: it first approximates the probability distribution of the data by a Gaussian mixture model, and then performs a factor analysis for each Gaussian. The two-stage algorithm has the benefit of weakening the coupling between the parameters of the Gaussian mixture model and those of the factor analyses.
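The two-stage idea just described can be illustrated with a minimal sketch. This is not the dissertation's own implementation: it uses scikit-learn's `GaussianMixture` and `FactorAnalysis`, synthetic data, and a simple hard partition of the samples by predicted component, all of which are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Synthetic data: two well-separated clusters, each with a 2-factor
# latent structure embedded in 5 dimensions.
X = np.vstack([
    rng.normal(0, 1, (200, 2)) @ rng.normal(0, 1, (2, 5)) + m
    for m in (np.zeros(5), 5 * np.ones(5))
])

# Stage 1: approximate the density of the data with a Gaussian mixture.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)  # hard assignment of each sample to a Gaussian

# Stage 2: fit a separate factor analysis to the samples of each Gaussian.
fas = [FactorAnalysis(n_components=2, random_state=0).fit(X[labels == k])
       for k in range(2)]
for k, fa in enumerate(fas):
    # components_ is the (n_factors, n_features) loading matrix
    print(k, fa.components_.shape)
```

Decoupling the stages in this way means the mixture weights and means are fixed before the loading matrices are estimated, rather than all parameters being updated jointly by one EM loop.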
We also propose two new methods that generate new data for each Gaussian and, through resampling and sample partitioning, reduce the amount of data the factor analyses must process.

4. Based on the Gaussian mixture model, a new method for independent component analysis (ICA) is proposed. After first approximating the probability density of the data, a stochastic gradient method is derived, and a minimum-entropy method that separates the independent components one by one is obtained when the data are whitened. A new measure of non-Gaussianity, called the joint entropy, is proposed, yielding a minimum joint entropy method. We also prove that jointly diagonalizing the covariance matrices of the Gaussians in the Gaussian mixture model of the data's density likewise yields the unmixing matrix. Finally, by showing that the joint diagonalization of real symmetric positive definite matrices is equivalent to an ICA problem, a new algorithm for joint diagonalization is presented.
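For the special case of two symmetric positive definite matrices, exact joint diagonalization is a classical result obtainable via a generalized eigendecomposition, which gives some intuition for how a joint diagonalizer can act as an unmixing matrix. The sketch below is an illustration of that special case, not the dissertation's algorithm: the shared matrix `M` and the diagonal spectra are synthetic assumptions, and SciPy's `eigh` is used as the solver.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
# Two SPD matrices sharing a common congruence transform, simulating the
# component covariances of a Gaussian mixture fitted to mixed signals.
M = rng.normal(size=(3, 3))                    # hypothetical shared mixing matrix
A = M @ np.diag([1.0, 2.0, 3.0]) @ M.T
B = M @ np.diag([4.0, 1.0, 0.5]) @ M.T

# Generalized eigenproblem A v = w B v: the eigenvector matrix V satisfies
# V.T @ A @ V = diag(w) and V.T @ B @ V = I, so V.T diagonalizes both.
w, V = eigh(A, B)
DA = V.T @ A @ V
DB = V.T @ B @ V
print(np.allclose(DA, np.diag(np.diag(DA)), atol=1e-8))  # True
print(np.allclose(DB, np.eye(3), atol=1e-8))             # True
```

Here `V.T` plays the role of an unmixing matrix, recovered up to scaling and permutation; jointly diagonalizing more than two matrices requires an approximate iterative scheme instead.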
Keywords/Search Tags:graph model, conditional independence, Bayesian networks, model simplification, MCMC, Gaussian mixture model, factor analysis, mixtures of factor analyzers, EM algorithm, independent component analysis, entropy, joint diagonalization