
Variational Learning Approach For Non-Gaussian Statistical Model And Its Application

Posted on: 2020-05-10    Degree: Master    Type: Thesis
Country: China    Candidate: N Gao    Full Text: PDF
GTID: 2370330575967961    Subject: Computer technology
Abstract/Summary:
Many methods exist today for processing massive amounts of data, and the probabilistic mixture model is a popular tool for analyzing and modeling such data. The Gaussian mixture model (GMM) has been widely used because it is computationally convenient, but it cannot adequately model bounded or semi-bounded data such as text and images. For such data, non-Gaussian statistical models show better modeling capability. This thesis studies variational learning approaches for non-Gaussian statistical models; the extended variational learning framework can address modeling problems in image processing, intrusion detection, and related applications.

First, the thesis briefly reviews the research background and significance of probabilistic mixture models and the current state of non-Gaussian statistical modeling. It then defines the probabilistic mixture model, lists several common probability density function forms, and introduces the standard algorithms for parameter estimation in mixture models: maximum likelihood and the variational Bayesian method.

Next, the thesis studies a variational learning algorithm for the Beta mixture model (BMM), a non-Gaussian statistical model used to model and analyze bounded data. Parameter estimation and model selection for the BMM are difficult because the Gamma function and its derivatives appear in intractable integral expressions. As in many other statistical models, this problem can be addressed with an efficient variational learning algorithm: a simple distribution is chosen to approximate the complex posterior, and a theorem giving a lower bound for the BMM is proposed. The extended variational inference approach introduces a single lower bound on the variational objective function, and the model parameters are estimated by maximizing this evidence lower bound. Because only one objective function is maximized during the iterations, the convergence of the approach is theoretically guaranteed. The detailed derivation is given in the thesis, the feasibility of the algorithm is verified on synthesized data, and a comparative experiment further demonstrates its efficiency.

Finally, the algorithm is applied to the widely used object image sets Caltech4, ETH-80, and MIT Scene. Feature vector sets are obtained with the RHOG descriptor, and the proposed efficient variational learning algorithm for the Beta mixture model is used for classification. The experimental results show that the algorithm can be applied to object classification.
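To make the Beta mixture structure concrete, the following is a minimal sketch of a K-component Beta mixture fitted to data on (0, 1). It uses a plain EM-style maximum-likelihood loop with numerical updates for the Beta shape parameters, not the extended variational inference derivation of the thesis; the function name, the univariate setting, and the choice of scipy's optimizer are illustrative assumptions.

```python
# Minimal sketch: K-component Beta mixture on data in (0, 1).
# This is NOT the thesis's extended variational inference; it illustrates
# the model structure with an EM-style maximum-likelihood loop and a
# numerical M-step for the Beta shape parameters (a_k, b_k).
import numpy as np
from scipy import stats, optimize

def fit_beta_mixture(x, K=2, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    # Initialise mixing weights and Beta shape parameters.
    pi = np.full(K, 1.0 / K)
    a = rng.uniform(0.5, 5.0, size=K)
    b = rng.uniform(0.5, 5.0, size=K)
    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] proportional to pi_k * Beta(x_n | a_k, b_k).
        log_r = np.log(pi) + np.stack(
            [stats.beta.logpdf(x, a[k], b[k]) for k in range(K)], axis=1)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, then each (a_k, b_k) by maximising the
        # responsibility-weighted Beta log-likelihood numerically.
        pi = r.mean(axis=0)
        for k in range(K):
            def neg_ll(params, k=k):
                ak, bk = params
                return -np.sum(r[:, k] * stats.beta.logpdf(x, ak, bk))
            res = optimize.minimize(neg_ll, x0=[a[k], b[k]],
                                    bounds=[(1e-3, None), (1e-3, None)])
            a[k], b[k] = res.x
    return pi, a, b

# Example: recover two Beta components from synthetic bounded data.
rng = np.random.default_rng(1)
x = np.concatenate([rng.beta(2.0, 8.0, 500), rng.beta(9.0, 3.0, 500)])
x = np.clip(x, 1e-6, 1 - 1e-6)   # keep data strictly inside (0, 1)
print(fit_beta_mixture(x, K=2))
```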
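A possible use for object categorization, assuming one mixture has been fitted per class with a routine like fit_beta_mixture above and that each feature dimension is treated as a draw from that class's univariate mixture (a simplification of the vector-valued model in the thesis): a test sample is assigned to the class whose mixture gives the highest log-likelihood.

```python
# Hypothetical class-conditional classifier built on the sketch above:
# one Beta mixture per object class on (0, 1)-normalised feature vectors
# (e.g. RHOG descriptors), with maximum-likelihood class assignment.
import numpy as np
from scipy import stats

def mixture_loglik(x, pi, a, b):
    # Sum over feature dimensions of the per-feature mixture log-density.
    comp = np.stack([np.log(pi[k]) + stats.beta.logpdf(x, a[k], b[k])
                     for k in range(len(pi))], axis=-1)
    return np.logaddexp.reduce(comp, axis=-1).sum()

def classify(x, class_models):
    # class_models: {label: (pi, a, b)} fitted with fit_beta_mixture.
    return max(class_models, key=lambda c: mixture_loglik(x, *class_models[c]))
```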
Keywords/Search Tags: Beta mixture models, Variational learning approach, Parameter estimation, Model selection, Object categorization