
Research Based On Locality Preserving Subspace Method With Its Application To Face Recognition

Posted on: 2010-05-28
Degree: Master
Type: Thesis
Country: China
Candidate: C L Zhou
Full Text: PDF
GTID: 2178360302466118
Subject: Software engineering
Abstract/Summary:
Face recognition is one of the most active research topics in pattern recognition. In practical applications, face images are high-dimensional, which gives rise to the "curse of dimensionality"; in such cases, dimensionality reduction is needed to extract features. That is, the high-dimensional data are mapped into a lower-dimensional space while preserving as much discriminant information as possible, which aids classification.

Among these methods, subspace-based approaches have attracted great attention over the past few years owing to their low computational cost and high recognition rates. At present, the best-known subspace algorithms are PCA, LDA, and LPP; when applied to face recognition, they are called Eigenfaces, Fisherfaces, and Laplacianfaces, respectively. PCA solves the eigen-problem of the training samples' covariance matrix and yields a set of orthogonal vectors, far fewer than the dimensionality of the samples, with which to represent the training data. From the viewpoint of linear reconstruction, PCA is optimal for representing samples. However, PCA is an unsupervised algorithm: it does not consider the difference between inter-class and intra-class variation, so it is not ideal for classification. LDA-based face recognition is a supervised algorithm: after the mapping, the within-class scatter is minimized and the between-class scatter is maximized, so it is well suited to classification and outperforms PCA. However, LDA suffers from the small-sample-size problem and assumes that the data follow a Gaussian distribution, which is rarely the case in practice.

Both PCA and LDA assume that face images reside in a Euclidean space, but much recent research has demonstrated that faces in fact lie on nonlinear sub-manifolds. To date, there are two kinds of nonlinear dimensionality reduction algorithms. One is based on the kernel trick, such as KPCA and KLDA.
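As a minimal illustration of the PCA step described above (an explanatory sketch, not the thesis's implementation), the eigen-problem of the samples' covariance matrix can be solved directly with NumPy:

```python
import numpy as np

def pca(X, k):
    """Project the n samples (rows of X) onto the top-k principal components."""
    X_centered = X - X.mean(axis=0)
    # Eigen-decomposition of the sample covariance matrix.
    cov = X_centered.T @ X_centered / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    components = eigvecs[:, ::-1][:, :k]     # keep the top-k eigenvectors
    return X_centered @ components
```

Because the top eigenvectors capture the directions of greatest variance, the first projected coordinate always has at least as much variance as the second.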
The kernel method first maps the data into a higher-dimensional space, in which data that are only nonlinearly separable in the original space become linearly separable (the patterns become more compact in the higher-dimensional space), and then performs PCA or LDA in that space. To some extent, the kernel method can handle nonlinearity, but choosing the kernel function is itself a problem, and the method ignores the manifold structure of the face data. The other kind is based on manifold learning, such as ISOMAP, Locally Linear Embedding (LLE), and Laplacian Eigenmaps (LE). All of these perform well for visualization, but they have drawbacks: the computation is expensive and they suffer from the out-of-sample problem, which makes them suboptimal for classification.

To solve these problems, He et al. proposed a linear version of LE called Locality Preserving Projections (LPP). Compared with manifold learning, it solves the out-of-sample problem and is an excellent algorithm. LPP is also an unsupervised method, but because it takes the manifold structure of the face data into consideration, it performs better than PCA and LDA. In this thesis, two improved algorithms based on locality preserving projections are proposed, and their performance is tested on face recognition.

First, the basics of face recognition are introduced and surveyed. The algorithms are classified as follows: geometric-feature-based, subspace-based, elastic-graph-matching-based, neural-network-based, model-matching-based, Hidden-Markov-Model-based, and Bayesian-based. The subspace-based algorithms are then surveyed comprehensively: supervised methods such as LDA perform better than unsupervised methods such as PCA, but both operate only in Euclidean space, so their performance is far less impressive than that of LPP.
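The LPP procedure discussed above can be sketched as follows. This is an illustrative implementation under common assumptions (heat-kernel weights on a nearest-neighbour graph, a regularised generalised eigenproblem), not the thesis's own code; the parameters `n_neighbors` and `t` are hypothetical defaults:

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, k, n_neighbors=5, t=1.0):
    """Locality Preserving Projections (sketch). Rows of X are samples."""
    n = len(X)
    # Heat-kernel weights on a nearest-neighbour graph.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]   # skip the point itself
        W[i, idx] = np.exp(-d2[i, idx] / t)
    W = np.maximum(W, W.T)                  # symmetrise the adjacency weights
    D = np.diag(W.sum(axis=1))
    L = D - W                               # graph Laplacian
    # Generalised eigenproblem  X^T L X a = lam X^T D X a; keep the
    # eigenvectors with the smallest eigenvalues.
    A, B = X.T @ L @ X, X.T @ D @ X
    B += 1e-6 * np.eye(B.shape[0])          # small ridge for numerical stability
    eigvals, eigvecs = eigh(A, B)           # scipy returns ascending eigenvalues
    return X @ eigvecs[:, :k]
```

In practice a PCA preprocessing step is often applied first so that the matrices in the generalised eigenproblem are well conditioned.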
But LPP, as an unsupervised method, has some drawbacks, and researchers have improved it, for example with Discriminant Locality Preserving Projections (DLPP) and Orthogonal Locality Preserving Projections.

Second, dimensionality reduction algorithms are studied and classified into three kinds: 1) linear methods, such as PCA and LDA; 2) nonlinear methods, namely manifold learning and kernel-based methods (within manifold learning, ISOMAP, LLE, and LE are introduced in detail; among kernel-based methods, KPCA and KLDA are covered specifically); 3) linear versions of manifold learning, such as LPP.

Lastly, on this basis, two algorithms are proposed: path-similarity-based locality preserving projections and a kernel discriminant locality preserving maximum-margin algorithm. The former not only considers the discriminant information of the data but also uses a robust method to describe the weights between adjacent data points, so it aids classification and is somewhat robust to noise. Specifically, the proposed algorithm provides a more effective way to measure the similarity of samples, namely path-based similarity, which reflects their relationships more effectively and is more robust to noise. It not only preserves locality, as in LPP, but also takes class information into consideration, so it achieves a good recognition rate.

In the kernel discriminant locality preserving maximum-margin algorithm, on one hand, the local structure of the data is taken into account; on the other, the class information is also considered, and the kernel trick makes it a nonlinear method. Specifically, the proposed algorithm preserves the local structure of the data, as in LPP, but overcomes LPP's drawback that it does not consider discriminant information and is therefore not ideal for classification.
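The abstract does not spell out the exact formulation of the path-based similarity, but a common definition (assumed here for illustration) is the max-min path similarity: the similarity between two samples is the weakest link on the best path connecting them, which makes points in the same elongated cluster highly similar and damps the influence of noisy direct edges:

```python
import numpy as np

def path_based_similarity(S):
    """Max-min path similarity (sketch, under an assumed definition).

    S is a symmetric pairwise-similarity matrix. The result P satisfies
    P[i, j] = max over paths i -> j of the minimum edge similarity on the path.
    """
    P = S.copy()
    n = len(P)
    # Floyd-Warshall-style closure: allow each node k as an intermediate hop.
    for k in range(n):
        # Entry (i, j) of the broadcast term is min(P[i, k], P[k, j]).
        P = np.maximum(P, np.minimum(P[:, k:k + 1], P[k:k + 1, :]))
    return P
```

For a chain of samples 0-1-2 with direct similarities 0.9, 0.8, and a noisy 0.1 between the endpoints, the path through the middle point raises the endpoint similarity to 0.8.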
The proposed algorithm takes class information into account, so separability is enhanced; meanwhile, the kernel trick is employed so that nonlinearity can be handled, and the algorithm achieves good performance. Experiments confirm the correctness and effectiveness of the proposed algorithms and demonstrate that they achieve higher recognition rates than the classical PCA, LDA, and LPP.
Keywords/Search Tags: feature extraction, locality preserving projection, face recognition, data dimensionality reduction