
The Research Of Discriminant Feature Extraction Methods And Application In Face Recognition

Posted on: 2009-01-27  Degree: Doctor  Type: Dissertation
Country: China  Candidate: Y S Lin  Full Text: PDF
GTID: 1118360245479304  Subject: Computer application technology
Abstract/Summary:
Feature extraction is one of the elementary problems in pattern recognition and the key to classification problems such as face identification. Over the past decade, many algorithms have been proposed to solve it. For example, linear discriminant analysis (LDA), principal component analysis (PCA) and independent component analysis (ICA) were developed for linear problems, while kernel methods based on the support vector machine (SVM) were proposed for nonlinear problems. In this dissertation, both linear and nonlinear feature extraction methods are analyzed in depth, and the proposed algorithms are successfully applied to face recognition.

Foley-Sammon discriminant analysis (FSD) is an efficient linear feature extraction method. However, its procedure for computing the discriminant vectors is extraordinarily time-consuming, and the method ignores the discriminant information contained in the null space of the within-class scatter matrix, even though this information is very effective for recognition. In this dissertation, a complete Fisher discriminant analysis based on Schur decomposition is proposed. The method directly obtains a set of optimal discriminant vectors satisfying both the orthogonality constraints and the Fisher criterion, while also exploiting the null space of the within-class scatter matrix. Experimental results on the ORL and FERET face databases indicate that the proposed method is valid.

Making use of a statistically uncorrelated projection space, a new method for computing statistically uncorrelated optimal discriminant vectors is then presented, based on the maximum scatter difference discriminant criterion. The uncorrelated optimal discriminant vectors are obtained by solving for the orthogonal vectors of the maximum scatter difference criterion within the uncorrelated projection space.
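The scatter matrices underlying the Fisher and maximum scatter difference criteria can be sketched in NumPy as follows. This is an illustrative sketch, not code from the dissertation; the function names and the balance parameter `C` are assumptions.

```python
import numpy as np

def scatter_matrices(X, y):
    """Between-class (Sb) and within-class (Sw) scatter matrices.

    X: (n_samples, n_features) data matrix; y: class labels.
    """
    mean_total = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_total).reshape(-1, 1)
        Sb += len(Xc) * diff @ diff.T          # class-mean spread
        Sw += (Xc - mc).T @ (Xc - mc)          # spread inside each class
    return Sb, Sw

def max_scatter_difference_direction(X, y, C=1.0):
    """First projection under the maximum scatter difference criterion:
    maximize w^T (Sb - C * Sw) w, solved by the top eigenvector."""
    Sb, Sw = scatter_matrices(X, y)
    vals, vecs = np.linalg.eigh(Sb - C * Sw)   # eigenvalues in ascending order
    return vecs[:, -1]                         # eigenvector of the largest eigenvalue
```

Unlike the Fisher quotient Sb/Sw, the scatter *difference* form avoids inverting Sw, which is why it remains usable when Sw is singular (e.g. the small-sample-size case common in face recognition).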
The purpose of the method is to maximize the between-class scatter while simultaneously minimizing the within-class scatter after projection, and to eliminate the statistical correlation between features. In addition, this dissertation reveals the relation between the maximum scatter difference discriminant criterion and the Fisher criterion for feature extraction. Experimental results on the ORL and NUST603 face databases show the effectiveness of the proposed algorithm, whose recognition rate is superior to those of MSDC and PCA.

Two-dimensional linear discriminant analysis (2DLDA) is an efficient linear feature extraction method. However, the projective vectors of 2DLDA reflect only variations between the rows of an image; variations between columns are omitted, even though they are usually also useful for recognition, so the recognition performance of 2DLDA suffers. To solve this problem, diagonal linear discriminant analysis (DiaLDA) is proposed in this dissertation. Experimental results on the ORL and FERET face databases demonstrate that the proposed algorithm is superior to 2DLDA and to several existing well-known methods.

In many real-world applications, the distributions of the original samples are highly complex and nonlinear, and conventional linear methods such as FDA and PCA classify them unsatisfactorily. Based on the kernel trick used in the support vector machine (SVM), a nonlinear feature extraction technique is finally presented. In the new approach, the kernel trick first maps the original samples into an implicit feature space via a nonlinear kernel mapping; two equivalent models based on the Fisher minimal discriminant criterion are then established in the feature space using the theory of reproducing kernels, and the optimal discriminant vectors are finally solved under this criterion.
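The kernel trick described above can be illustrated with a minimal two-class kernel Fisher discriminant. This is a generic textbook sketch, not the dissertation's specific model: the RBF kernel, the regularization term `reg`, and all function names are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gram matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fisher_direction(X, y, gamma=0.5, reg=1e-3):
    """Two-class kernel Fisher discriminant: the discriminant direction in
    feature space is expanded over the training samples with coefficients
    alpha = (N + reg*I)^{-1} (M1 - M0)."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    Ms = []
    N = np.zeros((n, n))
    for c in (0, 1):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                       # columns for class c
        Ms.append(Kc.mean(axis=1))           # kernelized class mean M_c
        nc = len(idx)
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    return np.linalg.solve(N + reg * np.eye(n), Ms[1] - Ms[0])

def project(X_train, X_new, alpha, gamma=0.5):
    """Project new samples onto the discriminant: f(x) = sum_i alpha_i k(x_i, x)."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Everything is expressed through inner products k(x_i, x_j), so the nonlinear mapping into the feature space never has to be computed explicitly.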
The proposed algorithm was tested and evaluated on the ORL and NUST603 face databases, and the experimental results show that it is valid.

Face recognition is one of the hottest research areas in pattern recognition, and many face recognition methods have been proposed. Recently, a number of learning algorithms have been proposed and applied successfully to face recognition tasks; among them, locality preserving projections (LPP) is one of the most effective. In this dissertation, we propose a new face recognition method, Orthogonal Discriminant Locality Preserving Projections with Schur decomposition (ODLPPS). In comparison with LPP, the objective function of the proposed method incorporates the between-class and within-class scatter difference information and makes the basis vectors orthogonal. Experimental results on the ORL, Yale and FERET databases demonstrate that the proposed algorithm achieves better face recognition performance than existing methods such as Eigenfaces, Fisherfaces, LPP and orthogonal LPP (OLPP).
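The baseline LPP step that ODLPPS builds on can be sketched as follows. This is a simplified illustration, not the dissertation's algorithm: the graph is a dense heat-kernel affinity rather than a k-nearest-neighbor graph, and the regularization constant is an assumption.

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, t=1.0):
    """Locality Preserving Projections (simplified: dense heat-kernel graph).

    Solves the generalized eigenproblem X^T L X a = lam X^T D X a and keeps
    the eigenvectors with the smallest eigenvalues, so that nearby samples
    stay nearby after projection.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / t)            # pairwise affinities (heat kernel)
    D = np.diag(W.sum(axis=1))     # degree matrix
    L = D - W                      # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])  # regularized for stability
    vals, vecs = eigh(A, B)        # generalized symmetric eigenproblem, ascending
    return vecs[:, :n_components]  # smallest-eigenvalue directions
```

The eigenvectors returned by plain LPP are not mutually orthogonal, since they are only B-orthogonal; enforcing orthogonality of the basis vectors is precisely the modification that OLPP-style methods, including the proposed ODLPPS, introduce.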
Keywords/Search Tags: Pattern Recognition, Feature Extraction, Principal Component Analysis, Fisher Linear Discriminant Analysis (LDA or FDA), Face Recognition, Locality Preserving Projections (LPP), Kernel Trick, Manifold Learning