
Research On Variational Bayesian Learning And Inference For Bayesian Network Models

Posted on: 2016-12-02    Degree: Doctor    Type: Dissertation
Country: China    Candidate: C Shen    Full Text: PDF
GTID: 1318330542474089    Subject: Navigation, guidance and control
Abstract/Summary:
Bayesian networks (BNs) are a class of graphical models that represent uncertain problems arising in many real-world applications as graphs, encoding the dependencies between variables in a probabilistic manner. Learning and inference are currently the most active research topics around BNs. Variational Bayes (VB) is a deterministic approximate Bayesian method that emerged from the machine learning community in the late 1990s. In the VB framework, the difficulty of computing the posterior probability density function via Bayes' theorem is circumvented by finding the extremum of a functional. Owing to its good estimation accuracy and low computational demand, VB has been widely used for learning and inference in various graphical models, and its applications have good prospects. The main work of this thesis is to apply VB to learning and inference for several typical, commonly used BNs.

For the mixture of Gaussians (MoG), a static BN, an elaborate VB learning and inference algorithm is derived. Iterative parameter learning and inference over the latent variables are discussed in detail, together with VB structural learning for determining model complexity. However, VB inevitably converges only to a local extremum, and the sampled data themselves may be noisy. By simulating the annealing process of a material, an inverse-temperature parameter is introduced to drive VB toward the global extremum, and by rebuilding the MoG model the uncertainty of each sampled datum is taken into account. A robust annealed VB algorithm for the MoG is then designed and derived, improving the robustness and convergence of the original method.

A typical dynamic BN, the Gaussian state space model (SSM), is considered next. In nonlinear filtering problems, the time-varying variances of the measurement noise are treated as random variables, just like the state. Using VB together with Bayesian optimal filtering, a noise-adaptive nonlinear filtering algorithm is derived. For the linear SSM, the colored measurement noise is assumed time-varying, and the work above is extended to the colored-noise case; a differencing technique modifies the model to accommodate the white-noise case. VB is further used to learn the equivalent covariance online while the state is estimated recursively. When both the process noise and the measurement noise have time-varying covariances and non-zero means, inverse-Wishart and Gaussian distributions are used to model the covariance and mean of the measurement noise, respectively. VB learns the parameters of these two distributions and infers the state recursively, and the process-noise statistics are updated in a timely manner as well.

For the non-Gaussian SSM, also a dynamic BN, the Student-t distribution is used to model the non-Gaussianity throughout. For the highly nonlinear non-Gaussian SSM, marginalized particle filtering marginalizes out the distributional parameters of the Student-t noise; these parameters are regarded as random variables with prior information, and VB learns them online. Once the sufficient statistics are obtained, the state is inferred by importance sampling. For a hybrid system containing several possibly switching non-Gaussian SSMs, the latent variable of the Student-t distribution is used to augment the state. In the proposed interacting multiple model method, the prior of the Gamma-distributed variable is computed for each sub-filter by moment matching the mixture of Gammas. At the filtering stage, VB updates the noise statistics and the state of each sub-filter is approximated; the estimated states are combined at the final stage.

Finally, a complex BN consisting of a mixture model and an SSM is considered. To learn and infer the non-Gaussian SSM in which a mixture of Student-t distributions models the measurements, a Dirichlet process mixture is introduced to form an infinite mixture that further enhances the robustness of the model. In the VB expectation step, Kalman smoothing for the SSM is approximated; in the VB maximization step, the parameters and structure of the mixture model are learned, so the optimal offline estimates are obtained. For the SSM whose state transition model is non-Gaussian, a Gaussian sum variational filtering algorithm is put forward. The non-Gaussian state is modeled as an MoG; for each Gaussian component, the mean and precision matrix are given Gaussian and Wishart distributions, respectively. VB iteratively learns the parameters of the mean, precision matrix and weighting coefficients to capture the uncertainty of the state and determine the weights of the Gaussians, and all states are inferred by importance sampling from the expected posterior state distributions parameterized by these variational parameters.
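The inverse-temperature idea for MoG learning can be illustrated with a minimal sketch. This is not the thesis's robust annealed VB algorithm; it is a plain EM-style update for a two-component 1-D mixture in which the responsibilities are tempered by an inverse temperature beta annealed from a small value up to 1, which flattens the objective early on and helps avoid poor local extrema. The initialisation and annealing schedule are illustrative choices.

```python
import math

def annealed_mog_em(data, n_iter=50):
    """Deterministic-annealing EM for a two-component 1-D MoG.
    Responsibilities use likelihoods raised to an inverse
    temperature beta that grows from 0.1 to 1 (annealing)."""
    # crude initialisation: components at the data extremes (illustrative)
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for it in range(n_iter):
        beta = min(1.0, 0.1 + 0.9 * it / (n_iter // 2))  # annealing schedule
        # E-step: tempered responsibilities r_nk ∝ (w_k N(x_n | mu_k, var_k))^beta
        R = []
        for x in data:
            p = [w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            p = [pk ** beta for pk in p]
            s = sum(p)
            R.append([pk / s for pk in p])
        # M-step: standard weighted mean/variance/weight updates
        for k in range(2):
            nk = sum(r[k] for r in R)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(R, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(R, data)) / nk + 1e-6
    return w, mu, var
```

At beta near 0 the responsibilities are nearly uniform, so all components see all the data; as beta reaches 1 the update reduces to ordinary EM.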
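The noise-adaptive filtering described above can be sketched in its simplest scalar form: a linear-Gaussian SSM whose unknown measurement-noise variance R_t is given an inverse-Gamma factor q(R_t), coupled to the Gaussian state factor by a few fixed-point VB iterations per time step. This is a one-dimensional illustration of the idea, not the thesis's inverse-Wishart multivariate algorithm; all parameter names and values are assumptions.

```python
def vb_adaptive_kf(ys, a=1.0, q=0.01, m0=0.0, p0=1.0,
                   alpha0=1.0, beta0=1.0, rho=0.99, n_vb=5):
    """Scalar Kalman filter with VB-adaptive measurement-noise variance.
    Posterior factorised as q(x_t) q(R_t), with q(R_t) inverse-Gamma
    (alpha, beta); rho < 1 forgets old noise statistics so R_t may drift."""
    m, p = m0, p0
    alpha, beta = alpha0, beta0
    out = []
    for y in ys:
        # time update for the state, forgetting for the noise statistics
        m_pred, p_pred = a * m, a * a * p + q
        alpha_pred, beta_pred = rho * alpha, rho * beta
        alpha = alpha_pred + 0.5          # one scalar measurement per step
        beta = beta_pred + 0.5            # initial guess, refined below
        for _ in range(n_vb):             # fixed-point VB iterations
            r_hat = beta / alpha          # effective R: 1 / E[1/R_t]
            k = p_pred / (p_pred + r_hat)
            m = m_pred + k * (y - m_pred)
            p = (1 - k) * p_pred
            beta = beta_pred + 0.5 * ((y - m) ** 2 + p)
        out.append((m, beta / alpha))     # state estimate, noise-variance estimate
    return out
```

Each VB sweep alternates a Kalman update given the current expected noise variance with a moment update of the inverse-Gamma factor given the current state estimate.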
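The moment-matching step used to merge the Gamma priors of the sub-filters can be written down compactly. The sketch below collapses a mixture of Gamma densities (shape/rate parameterisation) into a single Gamma by matching the mixture's first two moments; the function name is hypothetical.

```python
def match_gamma_mixture(weights, shapes, rates):
    """Collapse a mixture of Gamma(a_i, b_i) components (shape a_i,
    rate b_i, mixing weights w_i) into one Gamma(shape, rate) whose
    mean and variance equal those of the mixture."""
    # mixture mean and second moment: E[X] = a/b, E[X^2] = a(a+1)/b^2
    mean = sum(w * a / b for w, a, b in zip(weights, shapes, rates))
    second = sum(w * a * (a + 1) / b ** 2
                 for w, a, b in zip(weights, shapes, rates))
    var = second - mean ** 2
    # invert Gamma moments: mean = shape/rate, var = shape/rate^2
    return mean ** 2 / var, mean / var
```

In the interacting-multiple-model setting described above, this gives each sub-filter a single Gamma prior on the Student-t latent variable after the mixing step.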
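The Student-t latent variable that augments the state admits a short demonstration. A Student-t variate is a Gaussian scale mixture: conditional on a Gamma(nu/2, nu/2) precision-scaling latent lam, the observation is Gaussian with variance sigma^2 / lam. The sketch below draws samples through this representation; the function name is illustrative.

```python
import math
import random

def student_t_sample(mu, sigma, nu, rng):
    """Draw from Student-t(mu, sigma, nu) via its Gaussian scale-mixture
    form: lam ~ Gamma(shape=nu/2, rate=nu/2), then x | lam ~ N(mu, sigma^2/lam).
    random.gammavariate takes (shape, scale), so scale = 2/nu gives rate nu/2."""
    lam = rng.gammavariate(nu / 2.0, 2.0 / nu)
    return rng.gauss(mu, sigma / math.sqrt(lam))
```

Small lam draws inflate the conditional variance, producing the heavy tails that make the Student-t robust to outlying measurements; treating lam as part of the augmented state is what lets the VB filter update the noise statistics per time step.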
Keywords/Search Tags:Bayesian network, learning, inference, variational Bayes