
The Application Of Several Linear And Nonlinear Feature Extraction Methods To Face Recognition

Posted on: 2005-04-17
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y Xu
Full Text: PDF
GTID: 1118360152465787
Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Automatic face recognition is one of the most significant demands and also one of the most challenging tasks in pattern recognition. Great progress has been made on this subject over the past several decades; however, the results are still far from satisfying real-world requirements. Complex illumination, variable expression and non-fixed pose all complicate automatic face recognition. In theoretical research, researchers attempt to work out algorithms that recognize faces with a high correct classification rate and good efficiency. Among them, subspace methods, Fisher discriminant analysis, and nonlinear algorithms, mainly the support vector machine and the genetic algorithm, are often applied. On the application of linear methods based on the Fisher criterion to face recognition, Zhong Jin, Jian Yang and Jing-Yu Yang have done much significant work. Their research mainly focuses on the effectiveness of the algorithms, and meaningful results have been obtained. On the other hand, in real applications efficiency is also an important indicator for assessing an algorithm, because in many cases only highly efficient algorithms can satisfy the requirements of a real task. This thesis aims at designing face recognition algorithms, both linear and nonlinear, that are expected to be efficient as well as effective.

It is demonstrated that the Fisher criterion value of an F-S (Foley-Sammon) discriminant vector is not less than that of the corresponding Fisher discriminant vector. This demonstration explains the experimental results reported in previous references. It is also shown that the uncorrelated discriminant vector is identical to the classical Fisher discriminant vector. Furthermore, a new algorithm based on the Fisher criterion is developed.

Kernel Fisher discriminant analysis (KFD) is discussed in detail. In this method the classification of a test sample depends on the kernel functions between the test sample and all the training samples, so classification efficiency decreases as the number of training samples increases. The idea that a linear combination of a subset of the training samples, called "significant nodes", can approximate the discriminant vector in feature space is proposed. Based on this idea, a procedure for selecting the "significant nodes" is developed and an optimized algorithm for kernel Fisher discriminant analysis is obtained, in which a test sample can be classified using only the kernel functions between the test sample and the "significant nodes". Experimental results on benchmark datasets and a face image database show that the "significant nodes" are far fewer than the total training samples. As a result, feature extraction with the optimized algorithm is highly efficient, which is significant for real applications with strict efficiency requirements. Moreover, the correct classification rate achieved by the optimized algorithm is comparable to that of naive KFD. In addition, optimization schemes are designed for two-class classification and multi-class classification respectively.

For the two-class problem, a linear algorithm based on linear equations, which uses the kernel function together with suitable class labels, is shown to be equivalent to KFD. Furthermore, given a proper criterion based on the classification error, the linear algorithm is optimized for efficiency, and the optimized algorithm can be extended to multi-class classification. Experiments on two-class and multi-class problems show that this algorithm is both effective and efficient.
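To make the quantity compared in the F-S result above concrete, the following is a minimal numpy sketch of the Fisher criterion value J(w) = (w^T S_b w) / (w^T S_w w); the scatter-matrix definitions are the standard ones, and the function name is illustrative rather than taken from the thesis.

```python
import numpy as np

def fisher_criterion(X, y, w):
    """Fisher criterion value J(w) = (w' S_b w) / (w' S_w w) for direction w."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))   # between-class scatter
    S_w = np.zeros((d, d))   # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        S_b += Xc.shape[0] * diff @ diff.T
        S_w += (Xc - mc).T @ (Xc - mc)
    return float(w @ S_b @ w) / float(w @ S_w @ w)
```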
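The "significant nodes" optimization of KFD described above can be illustrated with the sketch below. It assumes the node subset and its expansion coefficients have already been produced by the selection procedure (not reproduced here); the RBF kernel and the function names are illustrative assumptions, not the author's code.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    """Gaussian (RBF) kernel between two sample vectors."""
    return np.exp(-np.sum((np.asarray(a) - np.asarray(b)) ** 2) / (2.0 * sigma ** 2))

def kfd_feature_naive(x_test, X_train, alpha, sigma=1.0):
    # naive KFD: the projection needs a kernel evaluation against every training sample
    return sum(a * rbf_kernel(x_test, x, sigma) for a, x in zip(alpha, X_train))

def kfd_feature_nodes(x_test, nodes, beta, sigma=1.0):
    # optimized KFD: only the selected "significant nodes" are evaluated,
    # so the cost per test sample drops from O(n_train) to O(n_nodes)
    return sum(b * rbf_kernel(x_test, z, sigma) for b, z in zip(beta, nodes))
```

Since the number of selected nodes is reported to be much smaller than the size of the training set, the second projection is where the efficiency gain comes from.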
The difference between this optimization scheme and the former one for KFD based on the Fisher criterion is also discussed.

Principal component analysis is one of the most widely used methods. However, the efficiency of feature extraction based on kernel PCA (KPCA) is likewise determined by the size of the training set. An optimization scheme for KPCA is designed to improve efficiency: with this scheme, feature extraction for test samples can be implemented using only a part of the training samples. On benchmark datasets the scheme obtains very good results.

The above algorithms and schemes achieve good classification results together with improvements in efficiency. It is notable...
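As a rough illustration of the KPCA optimization idea, the sketch below contrasts a standard kernel PCA fit, whose test-time projection needs kernel values against all training samples, with a projection that uses only a retained subset; the subset-selection step itself is assumed to be given, and all names are illustrative rather than the author's.

```python
import numpy as np

def rbf(a, b, sigma=1.0):
    return np.exp(-np.sum((np.asarray(a) - np.asarray(b)) ** 2) / (2.0 * sigma ** 2))

def kpca_fit(X, n_components=5, sigma=1.0):
    """Standard kernel PCA: expansion coefficients over all training samples."""
    n = X.shape[0]
    K = np.array([[rbf(xi, xj, sigma) for xj in X] for xi in X])
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one          # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))

def kpca_project_subset(x_test, subset, coef, sigma=1.0):
    # reduced projection: kernel evaluations against the retained subset only
    k = np.array([rbf(x_test, z, sigma) for z in subset])
    return k @ coef
```

In the full method a test sample is projected with kernel values against all n training samples; the optimized scheme replaces that step with the subset-based projection above.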
Keywords/Search Tags: Feature Extraction, FDA, Kernel PCA, Kernel NDA, Face Recognition under Varying Illumination