
Generalizing Expectation Propagation With Mixtures Of Exponential Family Distributions

Posted on: 2020-08-05 | Degree: Master | Type: Thesis
Country: China | Candidate: S J He | Full Text: PDF
GTID: 2428330596968138 | Subject: Computer application technology
Abstract/Summary:
A central task of Bayesian machine learning is to infer the posterior distribution of hidden random variables given observations and to compute expectations with respect to this distribution. This is often computationally intractable, so approximation schemes are required. Deterministic approximate inference techniques are an alternative to stochastic approximate inference methods based on numerical sampling, and many advances in this field have been made during the last twenty years. This paper proposes a new deterministic approximate inference method called generalized EP (GEP).

Expectation propagation (EP) is a widely used deterministic approximate inference algorithm in Bayesian machine learning. Traditional EP approximates an intractable posterior distribution through a set of local approximations that are updated iteratively. EP performs well in many fields, but it also has clear defects: its iterations are not guaranteed to converge, it does not scale to large data sets and complex models, and the local factor approximations consume excessive memory. In this paper, we propose a generalized version of EP, called GEP, a new approximate inference method based on the minimization of KL divergence. However, when the variance of the gradient of the objective function is large, the algorithm may need a long time to converge. We therefore use control variates and develop a variance-reduced version of the method, called GEP-CV. The proposed approach converges faster and performs better than other state-of-the-art approaches.

The GEP iterations converge and the method is suitable for complex models, so it is of practical significance to apply GEP to classical and widely used models. This paper further proposes Bayesian logistic regression based on GEP and a Bayesian neural network based on GEP. Since GEP is an approximate inference method under the Bayesian framework, it is far better than popular deep learning tools at capturing model uncertainty. Bayesian deep neural networks have received much attention recently because Bayesian models provide a theoretical framework for inferring model uncertainty. To obtain uncertainty estimates with real-world Bayesian deep learning models, practical inference approximations are needed. Dropout variational inference (VI), for example, has been used in computer vision and medical applications, but it can severely underestimate model uncertainty. In this paper, we use GEP as an approximate Bayesian inference method for Bayesian deep neural networks. This alleviates the problem that deep learning methods cannot capture model uncertainty, without sacrificing either computational efficiency or test accuracy.
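The projection step underlying EP-style algorithms can be illustrated with a one-dimensional moment-matching sketch: within a Gaussian family, the q that minimizes KL(p||q) simply matches the mean and variance of p. The sketch below assumes Python with NumPy, and the tilted distribution (a standard normal prior times a sigmoid likelihood factor) is an illustrative choice, not the thesis's model.

```python
import numpy as np

# One-dimensional sketch: the Gaussian q that minimizes KL(p || q)
# matches the mean and variance of p (moment matching), the core
# projection step in EP-style algorithms.
# The tilted distribution (standard normal prior times a sigmoid
# likelihood factor) is an illustrative assumption.

x = np.linspace(-10.0, 10.0, 20001)          # integration grid
dx = x[1] - x[0]
prior = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
likelihood = 1.0 / (1.0 + np.exp(-2.0 * x))  # sigmoid factor
tilted = prior * likelihood                  # unnormalized p(x)

Z = np.sum(tilted) * dx                      # normalizing constant
mean = np.sum(x * tilted) * dx / Z           # first moment of p
var = np.sum((x - mean)**2 * tilted) * dx / Z  # second central moment

print(f"moment-matched Gaussian: mean={mean:.4f}, var={var:.4f}")
```

The variance-reduction idea behind GEP-CV can likewise be sketched with a generic control variate: subtract from the Monte Carlo samples a correlated quantity with known expectation, scaled by the coefficient that minimizes the resulting variance. The target f and control h below are illustrative assumptions, not the actual GEP-CV gradient estimator.

```python
import numpy as np

# Minimal sketch of the control-variate idea used for variance reduction
# in stochastic estimators. f and h are illustrative choices, not the
# GEP-CV objective from the thesis.

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)

f = np.exp(0.5 * x)          # quantity whose expectation we estimate
h = x                        # control variate with known mean E[h] = 0

beta = np.cov(f, h)[0, 1] / np.var(h)   # coefficient minimizing variance
f_cv = f - beta * (h - 0.0)             # control-variate corrected samples

print("plain MC estimate   :", f.mean(), " sample var:", f.var())
print("control-variate est.:", f_cv.mean(), " sample var:", f_cv.var())
```

Because E[h] is known exactly, the corrected estimator remains unbiased while its sample variance drops. Finally, the way an approximate posterior yields model uncertainty can be illustrated by averaging predictions over sampled weights of a small network; the factorized Gaussian posterior below is a placeholder assumption standing in for whatever an approximate inference method such as GEP would produce.

```python
import numpy as np

# Sketch of predictive uncertainty from an approximate weight posterior:
# average predictions over sampled networks. The factorized Gaussian
# posterior here is an assumed placeholder, not GEP's actual output.

rng = np.random.default_rng(1)
D_in, H = 1, 16
x_test = np.linspace(-3, 3, 50).reshape(-1, D_in)

# assumed approximate posterior: independent N(mu, sigma^2) per weight
mu_w1, sd_w1 = rng.standard_normal((D_in, H)), 0.3
mu_w2, sd_w2 = rng.standard_normal((H, 1)), 0.3

preds = []
for _ in range(200):                          # posterior samples
    w1 = mu_w1 + sd_w1 * rng.standard_normal((D_in, H))
    w2 = mu_w2 + sd_w2 * rng.standard_normal((H, 1))
    preds.append(np.tanh(x_test @ w1) @ w2)  # one sampled network's output

preds = np.stack(preds)                       # (samples, points, 1)
pred_mean = preds.mean(axis=0)                # predictive mean
pred_std = preds.std(axis=0)                  # predictive uncertainty
print(pred_mean.shape, pred_std.shape)
```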
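The three sketches above cover, in order, the moment-matching projection, the control-variate correction, and posterior-sample prediction; each is a toy illustration under the stated assumptions rather than the thesis's implementation.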
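In particular, none of the snippets specify the GEP update equations or the GEP-CV gradient estimator themselves, which are defined in the full text.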
Keywords/Search Tags: Machine learning, Expectation propagation, Stochastic optimization, Variance reduction, Bayesian neural network