
Estimation Of Full Covariance GMM Model Based On Gaussian Convolution

Posted on: 2023-07-01
Degree: Master
Type: Thesis
Country: China
Candidate: Q Li
Full Text: PDF
GTID: 2530306842967959
Subject: Applied Mathematics
Abstract/Summary:
The Gaussian mixture model (GMM) is a probability density model widely used in data classification. In applications, the number of mixture components must be chosen first, which is often difficult, especially for high-dimensional data. Once the number of components is fixed, the model parameters are usually estimated by an iterative algorithm that has to be initialized, and existing research shows that such iterations are sensitive to the initialization: poorly chosen initial parameters can trap the iteration in a local optimum or even cause it to diverge. To address these problems, this thesis proposes a GMM estimation algorithm based on Gaussian convolution and the small probability event principle. The main work is as follows:

(1) The Gaussian convolution of multivariate normal density functions is studied in the full-covariance setting. Several results on the Fourier transform and Gaussian convolution of normal density functions are obtained, and the diagonal-covariance GMM based on Gaussian convolution is extended to the full-covariance Gaussian Convolution GMM (GC-GMM). Treating the covariance matrix of the Gaussian convolution kernel as a scale factor, the GC-GMM is essentially a GMM with a multiscale structure (the underlying convolution identity is sketched after this abstract). It provides a new way to explore the structure of a data distribution and to design GMM estimation algorithms.

(2) The GC-SPEP algorithm for GMM estimation. The proposed GC-SPEP algorithm consists of three sub-algorithms: a kernel selection algorithm based on a Gaussian convolution data transform (GCDT), a GMM initial estimation algorithm based on conditional probability exclusion (CPE), and a GMM optimization algorithm based on the small probability event principle (SPEP). GC-SPEP addresses the problems in GMM application by automatically adjusting the number of components and estimating the parameters during optimization, making it a data-driven, intelligent algorithm.

(3) Simulation analysis of the GC-SPEP algorithm. Experiments on four typical simulation models show the following. First, GC-SPEP has a clear computational advantage: as the number of components, the sample size, or the data dimension of the simulation model grows, the algorithm needs fewer than 5 iterations, while other algorithms such as classical EM need far more than 5. Second, an example shows that the classification performance of GC-SPEP is better than that of the methods proposed by Yang and Figueiredo. Third, the estimation error of GC-SPEP is smaller than that of the EM algorithm. The simulation analysis demonstrates the effectiveness of the algorithm.
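For reference, a minimal sketch of the standard convolution identity that underlies the GC-GMM construction described in item (1). The symbols φ, π_k, μ_k, Σ_k and the kernel covariance H are illustrative notation only, since the abstract does not fix the thesis's own:

```latex
% Convolution of two multivariate normal densities (standard identity):
\[
  \varphi(x;\mu_1,\Sigma_1) * \varphi(x;\mu_2,\Sigma_2)
    = \varphi(x;\,\mu_1+\mu_2,\,\Sigma_1+\Sigma_2).
\]
% Hence convolving a full-covariance GMM with the kernel \varphi(\cdot;0,H)
% adds the kernel covariance H to every component covariance:
\[
  \Bigl(\sum_{k=1}^{K}\pi_k\,\varphi(\,\cdot\,;\mu_k,\Sigma_k)\Bigr) * \varphi(\,\cdot\,;0,H)
    = \sum_{k=1}^{K}\pi_k\,\varphi(\,\cdot\,;\mu_k,\Sigma_k+H).
\]
% Varying H traces out a family of GMMs that share the same weights and
% means: the multiscale structure of the GC-GMM.
```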
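A hedged, self-contained numerical sketch of the same multiscale structure, assuming NumPy/SciPy. The function name gc_gmm_density and the example parameters are illustrative and not taken from the thesis; only the Σ_k + H smoothing rule comes from the identity above.

```python
# Illustrative sketch (not the thesis code): evaluating a full-covariance
# GMM density smoothed by a Gaussian convolution kernel with covariance H.
# By the convolution identity, smoothing only adds H to each component
# covariance, so the GC-GMM at scale H is itself a GMM.
import numpy as np
from scipy.stats import multivariate_normal

def gc_gmm_density(x, weights, means, covs, H):
    """Density of the Gaussian-convolved GMM at the points x.

    weights: (K,) mixture weights; means: (K, d); covs: (K, d, d) full
    covariances; H: (d, d) covariance of the convolution kernel (scale).
    """
    x = np.atleast_2d(x)
    dens = np.zeros(len(x))
    for w, mu, S in zip(weights, means, covs):
        dens += w * multivariate_normal(mean=mu, cov=S + H).pdf(x)
    return dens

# Example: a 2-component, 2-D full-covariance GMM at two smoothing scales.
weights = np.array([0.6, 0.4])
means = np.array([[0.0, 0.0], [3.0, 3.0]])
covs = np.array([[[1.0, 0.5], [0.5, 1.0]],
                 [[2.0, -0.3], [-0.3, 0.5]]])
x = np.array([[0.5, 0.5], [3.0, 2.5]])
for scale in (0.0, 1.0):          # H = scale * I
    H = scale * np.eye(2)
    print(scale, gc_gmm_density(x, weights, means, covs, H))
```

Evaluating at the two scales (H = 0 and H = I) illustrates the scale-indexed family of densities that the GC-GMM describes, with weights and means held fixed.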
Keywords/Search Tags: GMM model, component selection, parameter estimation, Gaussian convolution, GC-GMM model, conditional probability, small probability event principle