
Research on Gaussian Mixture Clustering Algorithms in Image Retrieval

Posted on: 2009-05-24
Degree: Master
Type: Thesis
Country: China
Candidate: S M Song
Full Text: PDF
GTID: 2178360242491860
Subject: Computer application technology
Abstract/Summary:
A Gaussian mixture model (GMM) describes the distribution of sample data in a parameterized way, and taking the Gaussian mixture parameters as image features is both convenient and effective. Hierarchical image retrieval uses a clustering algorithm to derive the Gaussian mixture parameters of an image class from those of the individual images. Hierarchical Gaussian mixture clustering efficiently reduces a large mixture of Gaussians into a smaller mixture while still preserving the component structure of the original model. The most probable classes can therefore be selected first, and only the images within those classes need to be searched, which greatly reduces the computational cost. The key component of hierarchical image retrieval is thus the hierarchical clustering algorithm.

Clustering analysis, which serves decision-making by exploring the inner structure of unlabeled data sets, has been widely studied in many contexts and domains. Extending the classical ideas of clustering analysis, Gaussian mixture clustering takes Gaussian components as its basic units and clusters complex mixtures into simpler ones, which makes it suitable for large, high-dimensional data sets. This thesis mainly discusses two recently introduced mixture clustering algorithms.

The first is the Hierarchical Expectation-Maximization (HEM) algorithm proposed by N. Vasconcelos, an extension of the classical EM algorithm. Because HEM does not take the covariance differences between mixture components into account, it tends to over-expand the larger components, that is, the components with larger covariance, so its results do not reflect the inner structure of the data set well. Based on HEM, we propose an improved algorithm named cov-HEM. Instead of the traditional idea of splitting the components with larger covariance, cov-HEM controls the over-expansion of the larger components and reinforces the effect of the smaller ones by rebalancing the posterior probabilities through a covariance coefficient.

The second is the Agglomerative Information Bottleneck (AIB) algorithm proposed by Slonim, an extension of the classical agglomerative clustering algorithm. The Monte-Carlo simulation formula it adopts for computing the information loss is problematic, and an explicit proof is given in Chapter 4. We propose a remedy by introducing a probability dissimilarity measure, and we carry the parameter-update idea of HEM over to AIB to compute the parameters of the new Gaussian component generated by each merge.

Other classical clustering algorithms can be extended with the same idea. Image retrieval experiments verify the proposed methods.
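The core operation the abstract relies on is reducing a large Gaussian mixture into a smaller one while keeping its component structure. The sketch below illustrates one HEM-style reduction pass in the general spirit of Vasconcelos' hierarchical EM; it is not the thesis's own implementation, and the function names, the virtual-sample count `n_virtual`, the initialization, and the use of full covariances are illustrative assumptions.

```python
# A minimal sketch (assumed details, not the thesis's code) of an HEM-style
# pass that reduces a base mixture {w_i, mu_i, Sigma_i} into a smaller mixture.
# Each base component is treated as a block of "virtual samples"; the E-step
# is computed in the log domain to avoid underflow for large virtual counts.
import numpy as np

def log_gauss(x, mu, cov):
    """Log-density of a multivariate Gaussian at point x."""
    d = x.shape[0]
    diff = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2 * np.pi) + logdet
                   + diff @ np.linalg.solve(cov, diff))

def reduce_mixture(w_b, mu_b, cov_b, n_reduced, n_virtual=1000, n_iter=20, seed=0):
    """Reduce a K-component Gaussian mixture to n_reduced components (HEM-style)."""
    rng = np.random.default_rng(seed)
    K, d = mu_b.shape
    # Initialise the reduced mixture from randomly chosen base components.
    idx = rng.choice(K, size=n_reduced, replace=False)
    w_r = np.full(n_reduced, 1.0 / n_reduced)
    mu_r, cov_r = mu_b[idx].copy(), cov_b[idx].copy()
    N_i = w_b * n_virtual                      # virtual sample count per base component

    for _ in range(n_iter):
        # E-step: responsibility of reduced component j for base component i.
        log_h = np.empty((K, n_reduced))
        for i in range(K):
            for j in range(n_reduced):
                log_h[i, j] = (np.log(w_r[j])
                               + N_i[i] * (log_gauss(mu_b[i], mu_r[j], cov_r[j])
                                           - 0.5 * np.trace(np.linalg.solve(cov_r[j], cov_b[i]))))
        log_h -= log_h.max(axis=1, keepdims=True)   # stabilise before exponentiating
        h = np.exp(log_h)
        h /= h.sum(axis=1, keepdims=True)

        # M-step: re-estimate the reduced mixture from the weighted base components.
        wN = h * N_i[:, None]
        w_r = wN.sum(axis=0) / N_i.sum()
        for j in range(n_reduced):
            mu_r[j] = (wN[:, j, None] * mu_b).sum(axis=0) / wN[:, j].sum()
            diff = mu_b - mu_r[j]
            cov_r[j] = ((wN[:, j, None, None]
                         * (cov_b + diff[:, :, None] * diff[:, None, :])).sum(axis=0)
                        / wN[:, j].sum())
    return w_r, mu_r, cov_r
```

With a reduction of this kind, a query is first matched against the small class-level mixtures, and only the images inside the most probable classes are compared in detail, which is the two-stage search described above.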
Keywords/Search Tags: image retrieval, Gaussian mixture model, hierarchical clustering, Hierarchical Expectation-Maximization, Agglomerative Information Bottleneck