
Study And Implementation Of Bayesian Network Parameter Learning Algorithm

Posted on: 2019-01-27  Degree: Master  Type: Thesis
Country: China  Candidate: J N Lei  Full Text: PDF
GTID: 2428330572452127  Subject: Computer application technology
Abstract:
Bayesian networks have been widely used to handle problems of uncertainty since their introduction. In recent years they have also been applied to characterizing regularities in data, owing to the simplicity of their modeling and the convenience of their computation. Research on Bayesian networks divides into two directions, parameter learning and structure learning; this thesis studies the former. Parameter learning means estimating the network parameters from data, given a Bayesian network structure. In practical observation, the sample values required for parameter learning are often missing, which greatly increases the difficulty of learning. The algorithms commonly used for parameter learning with missing data are the EM algorithm and the Gibbs sampling algorithm. However, the expectation computation in the E-step of the classical EM algorithm is complex, and its learning efficiency tends to decline on complicated Bayesian networks; the classical Gibbs sampling algorithm converges slowly and may take a long time in practice.

To address these problems, this thesis proposes improvements to the classical EM and Gibbs sampling algorithms. It first reviews the necessary probability theory, then introduces the concept of a Bayesian network and the classical algorithms used in parameter learning. Next, it describes application scenarios for parameter learning with EM and Gibbs sampling on Bayesian networks with missing values. The thesis proposes the concept of a weight value to simplify the E-step computation of the EM algorithm, and further uses Gibbs sampling to replace the E-step entirely, improving the running efficiency of the algorithm while preserving accuracy. By combining Gibbs sampling with the Bayesian network and setting the transition probabilities generated in the
sampling process as weights, the learning accuracy is greatly improved. For Bayesian networks with small samples, the thesis combines expert prior knowledge to give a parameter learning method that fuses expert priors. Verification on examples shows that this method makes full use of expert prior knowledge, overcomes the difficulty of obtaining sample data under practical conditions, and improves the accuracy of parameter learning.

Finally, on the basis of this theory and the algorithmic improvements, a Bayesian network parameter learning library, BPLlib, is implemented. BPLlib has three layers: a test component combines the basic components and the core algorithm components to realize and test the improved algorithms. The thesis analyzes the learning accuracy and computational efficiency of the improved EM and Gibbs sampling algorithms, and gives the learning scenarios for which each is suited.
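As an illustration of the weighted E-step idea described above, here is a minimal sketch of EM parameter learning on an assumed two-node binary network A → B in which some values of B are missing. The network, the data, and all function names are hypothetical simplifications, not taken from BPLlib.

```python
def em(data, iters=50):
    """EM for P(A=1) and P(B=1|A=a) on a two-node network A -> B.

    data is a list of (a, b) pairs with a in {0, 1} and b in
    {0, 1, None}; None marks a missing observation of B.
    """
    p_a = 0.5                    # current estimate of P(A=1)
    theta_b = {0: 0.5, 1: 0.5}   # current estimate of P(B=1 | A=a)
    for _ in range(iters):
        # E-step: accumulate expected counts; a missing b contributes
        # its posterior probability theta_b[a] as a fractional "weight"
        # instead of requiring a full expectation computation.
        n_a1 = 0.0
        n_a = {0: 0.0, 1: 0.0}
        n_b1 = {0: 0.0, 1: 0.0}
        for a, b in data:
            n_a1 += a
            n_a[a] += 1
            n_b1[a] += theta_b[a] if b is None else b
        # M-step: re-estimate the parameters from the weighted counts.
        p_a = n_a1 / len(data)
        theta_b = {a: n_b1[a] / n_a[a] for a in (0, 1)}
    return p_a, theta_b

# Example: eight records, two with B missing.
data = [(1, 1), (1, 1), (1, 0), (1, None),
        (0, 0), (0, 0), (0, 1), (0, None)]
p_a, theta_b = em(data)  # converges to p_a = 0.5, theta_b ~ {0: 1/3, 1: 2/3}
```

On this toy data the update for theta_b[1] is t' = (2 + t) / 4, a contraction whose fixed point 2/3 the loop reaches to machine precision well within 50 iterations.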
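The replacement of the E-step by Gibbs sampling can be sketched on the same assumed two-node network: instead of fractional weights, each missing value is sampled from its posterior under the current parameters, the parameters are re-estimated from the completed data, and estimates are averaged over sweeps after a burn-in period. All names and the tiny network are again illustrative assumptions.

```python
import random

def gibbs_learn(data, sweeps=2000, burn_in=500, seed=0):
    """Estimate P(B=1|A=a) on the two-node network A -> B by Gibbs
    sampling over the missing entries of B (None values in data)."""
    rng = random.Random(seed)
    theta_b = {0: 0.5, 1: 0.5}
    totals = {0: 0.0, 1: 0.0}
    kept = 0
    for sweep in range(sweeps):
        # Sample each missing B from its posterior under current params.
        completed = [(a, b if b is not None else
                      int(rng.random() < theta_b[a]))
                     for a, b in data]
        # Re-estimate P(B=1 | A=a) from the completed sample.
        for a in (0, 1):
            rows = [b for x, b in completed if x == a]
            theta_b[a] = sum(rows) / len(rows)
        if sweep >= burn_in:   # discard burn-in, then average
            for a in (0, 1):
                totals[a] += theta_b[a]
            kept += 1
    return {a: totals[a] / kept for a in (0, 1)}
```

On the same eight-record example as the EM sketch, the averaged estimates agree with the EM fixed point (about 1/3 and 2/3) up to Monte Carlo noise, which illustrates why the two approaches are interchangeable in the E-step.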
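The abstract does not specify how expert prior knowledge is fused with the data. One standard approach, shown here purely as an assumed illustration, is to encode the expert's beliefs as Dirichlet pseudo-counts and combine them with the observed counts, so that the prior dominates when samples are scarce and the data dominates as samples accumulate.

```python
def fuse_prior(counts, pseudo_counts):
    """Posterior-mean estimate of a discrete distribution: observed
    counts are combined with expert pseudo-counts, which act as a
    Dirichlet prior over the same outcomes."""
    total = sum(counts) + sum(pseudo_counts)
    return [(c + p) / total for c, p in zip(counts, pseudo_counts)]

# With only one observation, the expert prior (8:2) dominates:
probs = fuse_prior([1, 0], [8.0, 2.0])  # -> [9/11, 2/11]
```

The pseudo-count magnitude controls how strongly the expert opinion is weighted: [0.8, 0.2] expresses the same belief but yields much more quickly to the data than [8.0, 2.0].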
Keywords/Search Tags: Bayesian network, missing data, parameter learning, EM, Gibbs sampling, BPLlib