
Dimensionality Reduction Method Based On Local Geometric Relations And Its Application To Face Recognition

Posted on: 2009-08-24
Degree: Master
Type: Thesis
Country: China
Candidate: Y L Yu
Full Text: PDF
GTID: 2208360272459932
Subject: Circuits and Systems
Abstract/Summary:
As science and technology develop, the ability of people (and computers) to process data grows ever stronger, while the dimensionality and volume of data grow incredibly high. In many applications, such as face recognition and gene expression analysis, we first need to reduce the dimensionality of the data: on one hand, this avoids the "curse of dimensionality" and lowers computational and storage costs; on the other hand, it helps uncover the intrinsic structure of the data distribution.

Recently, inspired by manifold learning, new dimensionality reduction methods based on static weighting have become popular in face recognition. Their main principle is to guide dimensionality reduction by locality. Although these methods have succeeded in many respects, several problems remain open. Specifically: how should the basis vectors in linear dimensionality reduction be constrained? How can the weights be determined efficiently and automatically? Moreover, traditional methods tend to be dominated by a few particular instances, and a static weighting strategy is not adequate to address this. How can this problem be dealt with?

Focusing on these problems, this thesis makes the following contributions:

1. We propose Pseudo Inverse Extension to generalize manifold learning methods to the test set. By choosing appropriate positive definite kernels, our method is guaranteed to reproduce the result of the original manifold learning algorithm on the training set.

2. Building on a recent nonparametric linear dimensionality reduction method, Marginal Fisher Analysis (MFA), we propose adding orthogonality (OMFA) and uncorrelatedness (UMFA) constraints on the bases. We prove, both theoretically and experimentally, that OMFA and UMFA consistently outperform MFA.

3. Traditional dimensionality reduction methods, such as LDA, tend to be dominated by a few particular instances.
We thoroughly analyze this phenomenon and propose a new method, Dynamic Nearest Center Repulsion (DNCR), to overcome it. The final algorithm is optimized directly on the Grassmann manifold, which makes it more efficient.

4. Through an appropriate relaxation, we transform the non-convex problem in DNCR into a convex one, thereby eliminating local optima entirely. We apply this method to metric learning and find, interestingly, that its dual problem corresponds exactly to automatically learning the weights in MFA.
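The kernel-based out-of-sample idea behind Pseudo Inverse Extension can be sketched as follows. This is only an illustration, not the thesis's implementation: the RBF kernel and its `gamma` parameter are assumptions. The key property is that with a strictly positive-definite kernel, the Gram matrix of distinct training points is invertible, so the fitted map reproduces the training embedding exactly on the training set.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between rows of A and rows of B."""
    D2 = np.square(A[:, None, :] - B[None, :, :]).sum(-1)
    return np.exp(-gamma * D2)

def pseudo_inverse_extension(X_train, Y_train, X_test, gamma=0.5):
    """Extend a training-set embedding Y_train to new points X_test.

    Y_train is the low-dimensional embedding produced by any manifold
    learning method on X_train. For a strictly positive-definite kernel,
    K is invertible, so pinv(K) = inv(K) and the map agrees with Y_train
    at every training point.
    """
    K = rbf_kernel(X_train, X_train, gamma)
    alpha = np.linalg.pinv(K) @ Y_train          # pseudo-inverse fit
    return rbf_kernel(X_test, X_train, gamma) @ alpha
```

Evaluating the map back on the training points returns the original embedding, which is the guarantee stated in contribution 1.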
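To make the graph-embedding objective behind MFA concrete, here is a minimal sketch, not the thesis's own code. The neighborhood sizes `k1`, `k2` and the small ridge regularizer are illustrative assumptions. MFA builds an intrinsic graph over same-class nearest neighbors and a penalty graph over nearest between-class pairs, then seeks projections that minimize the within-class spread relative to the between-class margin via a generalized eigenproblem:

```python
import numpy as np
from scipy.linalg import eigh

def mfa_sketch(X, y, k1=3, k2=5, d=2):
    """Toy Marginal Fisher Analysis. X: (n, D) samples, y: class labels."""
    n = X.shape[0]
    D2 = np.square(X[:, None, :] - X[None, :, :]).sum(-1)  # pairwise sq. dists
    W_w = np.zeros((n, n))   # intrinsic graph: same-class k1-NN
    W_b = np.zeros((n, n))   # penalty graph: between-class k2-NN
    for i in range(n):
        same = np.where(y == y[i])[0]
        same = same[same != i]
        diff = np.where(y != y[i])[0]
        for j in same[np.argsort(D2[i, same])][:k1]:
            W_w[i, j] = W_w[j, i] = 1
        for j in diff[np.argsort(D2[i, diff])][:k2]:
            W_b[i, j] = W_b[j, i] = 1
    L_w = np.diag(W_w.sum(1)) - W_w      # Laplacian of intrinsic graph
    L_b = np.diag(W_b.sum(1)) - W_b      # Laplacian of penalty graph
    S_w = X.T @ L_w @ X                  # within-class compactness
    S_b = X.T @ L_b @ X + 1e-6 * np.eye(X.shape[1])  # margin + ridge
    # minimize w' S_w w / w' S_b w: smallest generalized eigenvectors
    _, evecs = eigh(S_w, S_b)
    return evecs[:, :d]                  # (D, d) projection basis
```

The generalized eigenvectors returned here are S_b-conjugate rather than orthonormal; OMFA and UMFA in contribution 2 replace this step with explicitly orthogonality- or uncorrelatedness-constrained bases.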
Keywords/Search Tags:Locality, Dimensionality Reduction, Face Recognition