
Research on Feature Extraction and Classifier Algorithms for High-Dimensional, Small-Sample-Size Data

Posted on: 2008-12-01    Degree: Doctor    Type: Dissertation
Country: China    Candidate: W D Wang    Full Text: PDF
GTID: 1118360215498539    Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
The "small sample size" (SSS) problem arises when the number of available training samples is small compared to the dimensionality of the sample space. A critical issue in applying linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA) is therefore the singularity and instability of the covariance matrix. This dissertation provides several solutions to the problem.

First, two methods for generating virtual samples are proposed. The first chooses a group of orthonormal vectors to produce virtual samples in a subspace, which are used to optimize the per-class covariance matrices. The second obtains many virtual samples by perturbing the training samples. These virtual samples make the per-class covariance matrices non-singular, so a quadratic discriminant classifier can be applied directly.

Second, intelligent feature-extraction and classifier algorithms, along with a dual-space algorithm, are put forward to address the singularity and instability. Applying machine-learning theory, the dissertation designs intelligent feature-extraction and classifier algorithms with a learning function: they automatically learn from test samples and update their knowledge. The dual-space algorithm projects test samples that are difficult to recognize in one subspace into a second subspace and classifies them there; the recognition results from the two subspaces are then evaluated together.

Moreover, the dissertation presents a novel classifier based on orthogonal projection for the small sample size problem. Its distinguishing feature is that it does not need to compute the inverses of the covariance matrices, and its recognition accuracy exceeds that of regularized discriminant analysis (RDA) and the nearest-neighbor classifier. Finally, based on a perturbation analysis of eigenvalues and eigenvectors, the dissertation points out that the eigenvectors of ill-conditioned eigenvalues may be perturbed to a great degree.
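The abstract does not give the exact perturbation scheme, but the second virtual-sample method can be sketched as follows: draw copies of the real training samples and add small feature-wise noise, so that the augmented class covariance matrix becomes full rank. The function name, the noise model (Gaussian, scaled by per-feature standard deviation), and the `noise_scale` parameter are all assumptions for illustration, not the dissertation's actual algorithm.

```python
import numpy as np

def virtual_samples_by_perturbation(X, n_virtual, noise_scale=0.05, seed=0):
    """Generate virtual samples by adding small random perturbations
    to the real training samples of one class.

    X           : (n_samples, n_features) real training samples
    n_virtual   : number of virtual samples to generate
    noise_scale : perturbation magnitude relative to per-feature std
    """
    rng = np.random.default_rng(seed)
    # Draw base samples with replacement, then perturb every feature
    base = X[rng.integers(0, len(X), size=n_virtual)]
    sigma = X.std(axis=0) + 1e-12          # avoid zero std in flat features
    noise = rng.normal(0.0, noise_scale, base.shape) * sigma
    return base + noise

# Toy check: 5 real samples in 50 dimensions give a singular class
# covariance; augmenting with virtual samples restores full rank.
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 50))
X_aug = np.vstack([X, virtual_samples_by_perturbation(X, 60)])
print(np.linalg.matrix_rank(np.cov(X, rowvar=False)))      # at most 4
print(np.linalg.matrix_rank(np.cov(X_aug, rowvar=False)))  # 50 (full rank)
```

Once the augmented covariance is non-singular, QDA's per-class inverse covariance can be computed directly, which is the point the paragraph above makes.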
The proposed algorithm simplifies the projection matrix, improves the efficiency of feature extraction, and makes the recognition rate robust.
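The orthogonal-projection classifier mentioned above is not specified in the abstract; a common scheme with the same key property (no covariance inverse) is the nearest-class-subspace rule sketched below. Each class is represented by an orthonormal basis of its training samples, and a test sample is assigned to the class whose subspace leaves the smallest projection residual. The class name and all implementation details are assumptions for illustration; the dissertation's classifier may differ.

```python
import numpy as np

class OrthogonalProjectionClassifier:
    """Nearest-class-subspace rule: classify by the smallest residual
    after orthogonally projecting onto each class's sample subspace.
    Needs no covariance-matrix inverse (a sketch, not the thesis method)."""

    def fit(self, X, y):
        self.bases_ = {}
        for c in np.unique(y):
            Xc = X[y == c]
            # Orthonormal basis of the span of class-c samples via SVD
            U, s, _ = np.linalg.svd(Xc.T, full_matrices=False)
            self.bases_[c] = U[:, s > 1e-10]
        return self

    def predict(self, X):
        preds = []
        for x in X:
            # Residual norm after projection onto each class subspace
            res = {c: np.linalg.norm(x - B @ (B.T @ x))
                   for c, B in self.bases_.items()}
            preds.append(min(res, key=res.get))
        return np.array(preds)

# Toy usage: two classes spanning disjoint coordinate subspaces of R^10
rng = np.random.default_rng(0)
X = np.zeros((6, 10))
X[:3, :2] = rng.normal(size=(3, 2))    # class 0 lives in span(e0, e1)
X[3:, 2:4] = rng.normal(size=(3, 2))   # class 1 lives in span(e2, e3)
y = np.array([0, 0, 0, 1, 1, 1])
clf = OrthogonalProjectionClassifier().fit(X, y)
t = np.zeros((2, 10))
t[0, :2] = [1.0, -2.0]                 # lies in class-0 subspace
t[1, 2:4] = [3.0, 0.5]                 # lies in class-1 subspace
print(clf.predict(t))                  # [0 1]
```

Because only orthonormal bases and residual norms are involved, the method stays well defined even when each class has far fewer samples than features, which is exactly the SSS regime discussed throughout.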
Keywords/Search Tags:pattern recognition, small sample size problem, feature extraction, classifiers, virtual samples, machine learning, dual-space algorithms, perturbation analysis, face recognition