
Low-rank Representation Based Discrimination And Regression Based Classification

Posted on: 2013-08-03
Degree: Doctor
Type: Dissertation
Country: China
Candidate: N Zhang
Full Text: PDF
GTID: 1228330395483723
Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Pattern recognition emerged in the 1920s and became an established discipline in the 1960s. Its role is classification: assigning each input to one of a given set of classes. A pattern recognition system consists of four steps, i.e. data acquisition, preprocessing, feature extraction, and classification, of which feature extraction and classification are the two most fundamental. Feature extraction aims to obtain, from the original information of the samples, effective information that is useful for classification. The most classical feature extraction methods are PCA, LDA, and ICA; in recent years, low-rank representation has become a hot topic in machine learning and pattern recognition. A classifier then decides the label of a test sample based on the feature vectors produced by feature extraction. Many effective classifiers exist: the kNN classifier, the Bayes classifier, neural networks, subspace methods, SVM, and fuzzy-set and rough-set methods are classical examples. The sparse representation-based classifier has shown great potential for pattern classification, and the collaborative representation-based classifier was proposed to examine whether the l1-norm sparsity constraint is what makes the sparse representation-based classifier powerful for face recognition.

The major research work in this dissertation includes the following four aspects.

First, a low-rank representation (LRR) based discriminative projection analysis method (LRR-DP) is proposed. Starting from the weights in the affinity matrix obtained by LRR, class information is used to further enhance the clustering of the training samples. Specifically, the method makes the within-class samples as compact as possible, the between-class samples as separate as possible, and the representation error (the transformed noise) as small as possible. Assuming that the low-rank structure of the data is preserved under a linear projection, the method seeks such a projection.

Second, locality-based versions of the sparse representation-based classifier (SRC) and the collaborative representation-based classifier (CRC) are presented. Given a test sample, its K nearest neighbors are first found among all training samples, and the test sample is then represented over these K nearest neighbors, with the representation coefficients obtained by solving an optimization problem. Finally, the class of the test sample is decided by the residual between the test sample and its reconstruction from the neighbors belonging to each class. The proposed Local SRC and Local CRC classifiers turn out to be more efficient and effective than SRC and CRC in most cases.

Third, a component-based global k-NN (CG-k-NN) classifier is proposed, which exploits the structural information of the local neighbors to enhance classification performance. The k nearest neighbors of a test sample are first chosen globally, and these neighbors are used to represent the test sample via ridge regression. A component image of each class is then constructed from the intra-class images among the k nearest neighbors and the corresponding representation coefficients. Finally, the test sample is assigned to the class with the minimal reconstruction residual. The structural information of the data and the useful component images make the CG-k-NN classifier more efficient and effective. The neighbor-coding-and-residual mechanism shared by these classifiers is sketched below.
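As a minimal sketch of that mechanism, the following snippet implements a locality-based collaborative representation classifier in the spirit of Local CRC: ridge-regression coding over the K nearest neighbors followed by class-wise residual comparison. The function name, the Euclidean distance metric, and the regularization weight `lam` are illustrative assumptions rather than the dissertation's notation.

```python
import numpy as np

def local_crc_predict(X_train, y_train, x_test, K=20, lam=0.01):
    """Classify x_test by ridge-regression coding over its K nearest neighbors.

    X_train : (n_samples, n_features) training matrix
    y_train : (n_samples,) class labels
    x_test  : (n_features,) test sample
    """
    # 1. Find the K nearest training samples of the test sample (Euclidean distance).
    dists = np.linalg.norm(X_train - x_test, axis=1)
    nn_idx = np.argsort(dists)[:K]
    D = X_train[nn_idx].T                      # local dictionary, shape (n_features, K)
    labels = y_train[nn_idx]

    # 2. Ridge-regression coding: coef = (D^T D + lam*I)^{-1} D^T x_test.
    coef = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ x_test)

    # 3. Class-wise reconstruction residuals; the smallest residual decides the label.
    classes = np.unique(labels)
    residuals = [np.linalg.norm(x_test - D[:, labels == c] @ coef[labels == c])
                 for c in classes]
    return classes[int(np.argmin(residuals))]
```

Replacing the ridge penalty in step 2 with an l1 penalty would give the corresponding Local SRC variant of the same scheme.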
Finally, based on Lasso regression and ridge regression, two design strategies, population-based and class-based, are presented, yielding four classifiers: the Population Lasso Classifier, the Class Lasso Classifier, the Population kNN Ridge Regression Classifier, and the Class kNN Ridge Regression Classifier. Comparing the population-based classifiers with the class-based ones shows that the population-based classifiers are suitable for small sample size problems, whereas the class-based classifiers are suitable for large sample size problems; the two Lasso designs are sketched below.
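As a hedged illustration of the population-based versus class-based designs, the sketch below codes a test sample either over the whole training set at once or over each class separately, and classifies by the smallest reconstruction residual. It relies on scikit-learn's Lasso; the function names and the `alpha` value are assumptions for illustration, not the dissertation's formulation.

```python
import numpy as np
from sklearn.linear_model import Lasso

def pop_lasso_predict(X_train, y_train, x_test, alpha=0.01):
    # Population design: one l1-regularized coding over the whole training set,
    # then class-wise reconstruction residuals decide the label.
    coef = Lasso(alpha=alpha, fit_intercept=False).fit(X_train.T, x_test).coef_
    classes = np.unique(y_train)
    res = [np.linalg.norm(x_test - X_train[y_train == c].T @ coef[y_train == c])
           for c in classes]
    return classes[int(np.argmin(res))]

def class_lasso_predict(X_train, y_train, x_test, alpha=0.01):
    # Class design: a separate l1-regularized coding per class; keep the class
    # whose own reconstruction residual is smallest.
    classes = np.unique(y_train)
    res = []
    for c in classes:
        Dc = X_train[y_train == c].T
        coef_c = Lasso(alpha=alpha, fit_intercept=False).fit(Dc, x_test).coef_
        res.append(np.linalg.norm(x_test - Dc @ coef_c))
    return classes[int(np.argmin(res))]
```

The ridge-regression counterparts follow the same population/class split, with the l1 penalty replaced by an l2 penalty over the k nearest neighbors.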
Keywords/Search Tags: feature extraction, face recognition, palm print recognition, character recognition, low-rank representation, sparse representation, linear regression