
Feature Extraction And Classification Based On Subspace Analysis And Its Applications

Posted on: 2013-06-08    Degree: Doctor    Type: Dissertation
Country: China    Candidate: J Xu    Full Text: PDF
GTID: 1228330395983723    Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Feature extraction and classification are two main branches of pattern recognition; their aim is to recognize individual identities from the effective information in images. In this dissertation, building on subspace learning and using algebraic statistics as our tool, we develop several novel feature extraction techniques and classifiers, compare them with currently popular recognition algorithms, and verify their effectiveness. The main work and innovations of this dissertation are as follows:

(1) A sparse Fisher linear discriminant analysis (SFLDA) is proposed. Exploiting the equivalence of Fisher linear discriminant analysis (FLDA) and least-squares Fisher linear regression (LSLR) on binary-class recognition problems, we obtain sparse discriminative vectors by solving a least-squares optimization problem. The sparse Fisher discriminative vectors help identify the main factors that affect the decision and admit psychological, physiological, or physical interpretations. In addition, because the sparse Fisher discriminative vectors are obtained by solving a least-squares optimization with an L1-norm constraint on the coefficients, rather than by solving a generalized eigen-equation, computation time is reduced.

(2) A local graph embedding discriminant analysis is proposed for face recognition with a single training sample per person. Because only one training sample per class is available, we overcome this limitation from two aspects: one is to construct imitated training samples using a mean filter with a 2x2 window; the other is to use graph embedding to characterize the local data structure, rather than the global one.
Based on these two considerations, the resulting local graph embedding discriminant analysis successfully avoids the "small sample size" problem, the resulting recognition system becomes more stable, and recognition performance is boosted considerably.

(3) A de-correlated locality preserving projection (RLPP) is proposed. Building on the locality preserving projection (LPP) algorithm, we use a recursive method to obtain the de-correlated discriminative vectors one by one. Unlike the existing uncorrelated locality preserving projection (ULPP), the proposed RLPP offers another simple but effective way to find de-correlated discriminative vectors. ULPP and RLPP reach the same goal by different routes, so either can be used to develop a de-correlated version of any feature extractor.

(4) A unified framework is designed for building a recognition system that combines the feature extractor and the classifier under the same measure metric. Specifically, we select an effective classifier and design a matched feature extraction approach under its metric. Taking the regularized K-local hyperplane distance nearest neighbor (RHKNN) classifier as an example, we develop the RHKNN-classification-oriented local discriminant analysis (HOLDA). The recognition performance of the system combining RHKNN with HOLDA is improved, since under the same measure metric the features extracted by HOLDA are well suited to RHKNN.

(5) A fuzzy similar neighbor classification (FSNC) algorithm is proposed. Taking the similarity between samples into account and introducing fuzzy set theory into the algorithm, the similarity between each query sample and every category can be quantified, and the decision is made on the basis of the obtained similarity.
It is worth noting that the "similar neighbors" and the "similarity" between a sample and a category are obtained automatically by exploiting a nonnegative sparse representation method. In this way, the negative influence of manual parameter choices on classification performance is reduced, which benefits recognition accuracy.

(6) Based on the linear regression classifier (LRC), we develop a kernel LASSO regression classifier (LASSO-KRC), an improved version of LRC. We improve LRC from two aspects: one is to impose an L1-norm penalty on the regression coefficients, so that the measure metric of LRC is more reliable; the other is to extend the regularized LRC to the nonlinear case, i.e., a kernel extension of the regularized LRC, so that samples in the kernel Hilbert space are more separable. As is well known, without an explicit mapping function it is not easy to develop the kernel version of L1-LRC. To this end, we use the kernel trick together with calculus to successfully extend L1-LRC to the nonlinear case. Motivated by this, many least-squares optimizations with an L1-norm constraint can easily be given kernel versions of their own.
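The common computational core of items (1) and (6) above is an L1-regularized least-squares problem solved in place of a generalized eigen-equation. The dissertation's exact formulation is not reproduced here; the sketch below is a minimal, illustrative NumPy implementation of that idea for the binary case of item (1), using ISTA (iterative soft-thresholding). All function names, the class-coded targets, and parameter values (`lam`, `n_iter`) are assumptions for the sketch, not the author's specification.

```python
import numpy as np

def sparse_fisher_lda(X, y, lam=100.0, n_iter=500):
    """Sparse discriminant direction via L1-penalized least squares (ISTA).

    For a binary problem, FLDA is equivalent to least-squares regression
    onto suitably class-coded targets; adding an L1 penalty on the
    coefficients yields a sparse discriminative vector.
    """
    n, d = X.shape
    n1, n0 = int((y == 1).sum()), int((y == 0).sum())
    # Class-coded targets (mean-zero by construction): +n/n1 vs. -n/n0.
    t = np.where(y == 1, n / n1, -n / n0).astype(float)
    Xc = X - X.mean(axis=0)                      # center the features
    lr = 1.0 / np.linalg.norm(Xc, 2) ** 2        # step from spectral norm
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = Xc.T @ (Xc @ w - t)               # least-squares gradient
        w = w - lr * grad
        # Soft-thresholding enforces exact zeros (the sparsity of SFLDA).
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

# Toy demo: 10-dim data where only the first two features discriminate.
rng = np.random.default_rng(0)
n_per = 100
X1 = rng.normal(0.0, 1.0, (n_per, 10)); X1[:, :2] += 1.0
X0 = rng.normal(0.0, 1.0, (n_per, 10)); X0[:, :2] -= 1.0
X = np.vstack([X1, X0])
y = np.array([1] * n_per + [0] * n_per)
w = sparse_fisher_lda(X, y)
```

On such data the recovered direction concentrates its weight on the two informative features and drives most noise coefficients exactly to zero, which is the interpretability benefit the abstract attributes to the sparse discriminative vectors.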
Keywords/Search Tags: feature extraction, classification, graph embedding, recursion, sparsity, fuzzy set theory, kernel