
Classification Methods Based On Support Vector Machines And Manifold Learning

Posted on: 2009-04-16
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X Y Tao
Full Text: PDF
GTID: 1118360245468510
Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Classifier design and feature extraction are two important research topics in pattern recognition. Driven by wide applications in both military and civilian areas, they have developed rapidly, and many new algorithms have been proposed, among which support vector machines (SVMs) and manifold learning are especially prominent. Supported by the National Defense Foundation and motivated by the drawbacks of existing SVM and manifold-learning methods in real applications, this dissertation covers four topics: multi-class SVM algorithms, unbalanced data classification, proximal SVM (PSVM) ensembles, and feature extraction methods based on manifold learning. A series of new, effective methods are proposed. The main research results of this thesis are as follows.

By combining the nonlinearity and topological ordering of the self-organizing map (SOM), a multi-class SVM based on a SOM decoding algorithm is proposed. Because the confidence of the binary classifiers is fully taken into account, the classification performance of the new method is improved, and the effect is especially pronounced for the simple one-versus-all (OVA) encoding strategy.

A modified PSVM algorithm, MPSVM, is developed for unbalanced data classification, extending the application range of PSVM. By introducing a diagonal matrix into the original PSVM optimization problem, different penalty factors are assigned to the positive and negative training sets, which adapts the method to unbalanced data. Based on the Lagrange equations, the Sherman-Morrison-Woodbury formula, and a rectangular kernel, decision functions for the linear and nonlinear MPSVM are derived. Experimental results illustrate the effectiveness of the new method.

Feature selection has been shown to be an effective strategy for ensemble creation, owing to its ability to produce a set of good feature subsets for ensemble learning that make the component classifiers disagree.
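The class-weighted proximal SVM idea described above can be sketched as follows. This is a minimal illustration, not the thesis's derivation: it uses the standard linear PSVM formulation (minimize a regularized squared slack) and replaces the single penalty parameter with a per-sample penalty taken from two assumed parameters, `C_pos` and `C_neg`; all names are illustrative.

```python
import numpy as np

def weighted_psvm(A, d, C_pos=1.0, C_neg=1.0):
    """Linear proximal SVM with class-dependent penalties (a sketch of the
    MPSVM idea: a diagonal penalty matrix assigns different costs to the
    positive and negative classes).

    A: (m, n) training samples, one per row; d: labels in {+1, -1}.
    Solves (I + H^T P H) z = H^T P e for z = [w; gamma], where
    H = D [A, -e], D = diag(d), and P holds the per-sample penalties.
    """
    m, n = A.shape
    e = np.ones(m)
    D = np.diag(d)
    H = D @ np.hstack([A, -e[:, None]])   # H = D [A, -e]
    p = np.where(d > 0, C_pos, C_neg)     # diagonal of the penalty matrix P
    HtP = H.T * p                         # H^T P without forming P explicitly
    z = np.linalg.solve(np.eye(n + 1) + HtP @ H, HtP @ e)
    return z[:n], z[n]                    # w, gamma

def predict(w, gamma, X):
    """Classify by the sign of the separating plane w.x - gamma."""
    return np.sign(X @ w - gamma)
```

Raising `C_pos` relative to `C_neg` penalizes errors on the (typically rarer) positive class more heavily, which is the mechanism the thesis uses to handle unbalanced data.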
A new PSVM ensemble method based on feature selection is presented, in which suitable feature subsets are selected for the component PSVMs by the Relief(F) algorithm. Because the quality and the diversity of the component classifiers are considered simultaneously, generalization performance is improved. Experimental results on UCI datasets and a radar dataset demonstrate that the new approaches have better generalization power and robustness.

Based on the Neighborhood Preserving Embedding (NPE) algorithm, a novel feature extraction method called orthogonal NPE (ONPE) is proposed. First, a function reflecting the locality-preserving power of the projection vectors is defined. Then, with this neighborhood-preserving function as the objective and orthogonality constraints added to the original optimization problem, iterative formulae for finding a set of orthogonal optimal projection vectors are derived. Compared with NPE, the orthogonal vectors have better locality-preserving power, so stronger discriminant power is obtained and the error rate is reduced.

Combined with the kernel trick, kernel NPE (KNPE) is presented, which preserves the local manifold structure in a nonlinear feature space. Since higher-order information about the data is used, the description of the samples is richer and more detailed, while the merits of NPE are retained, making the extracted features more effective. To avoid computing the inverse of the positive semi-definite kernel matrix, a transformed optimization problem and QR decomposition are used. A deeper analysis reveals the nature of KNPE: KNPE = KPCA + NPE, which leads to an easy solution for KNPE.
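The NPE algorithm that ONPE and KNPE build on can be sketched in a few lines. This is a minimal illustration under standard assumptions (LLE-style reconstruction weights from k nearest neighbors, then the linear generalized eigenproblem X^T M X a = lambda X^T X a); the function name, `k`, `d`, and the regularization constant `reg` are all illustrative choices, not the thesis's notation.

```python
import numpy as np

def npe(X, k=5, d=2, reg=1e-3):
    """Minimal Neighborhood Preserving Embedding sketch.
    X: (m, n) samples, one per row.  Returns an (n, d) projection matrix."""
    m, n = X.shape
    # Pairwise squared distances -> k nearest neighbours (excluding self).
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(D2, np.inf)
    nbrs = np.argsort(D2, axis=1)[:, :k]
    # LLE-style reconstruction weights: each point as a combination of its
    # neighbours, weights summing to one.
    W = np.zeros((m, m))
    for i in range(m):
        Z = X[nbrs[i]] - X[i]                  # centred neighbours
        G = Z @ Z.T
        G += reg * np.trace(G) * np.eye(k)     # regularise the Gram matrix
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs[i]] = w / w.sum()
    M = (np.eye(m) - W).T @ (np.eye(m) - W)
    # Linear generalized eigenproblem: X^T M X a = lam X^T X a.
    # Smallest eigenvalues give the locality-preserving directions.
    A_mat = X.T @ M @ X
    B_mat = X.T @ X + reg * np.eye(n)
    vals, vecs = np.linalg.eig(np.linalg.solve(B_mat, A_mat))
    order = np.argsort(vals.real)
    return vecs[:, order[:d]].real
```

ONPE's modification, as described above, replaces this unconstrained eigenproblem with an iterative scheme that forces the projection vectors to be mutually orthogonal; KNPE applies the same construction after a kernel mapping.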
Keywords/Search Tags: Support vector machines, Multi-classification, Unbalanced data, Relief(F), Ensemble learning, Manifold learning, Orthogonal NPE, Kernel NPE