
Research Of Nonparallel Hyperplane Classifier Algorithms

Posted on: 2016-05-04
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X P Hua
Full Text: PDF
GTID: 1108330479986213
Subject: Computer application technology
Abstract/Summary:
The nonparallel hyperplane classifier (NHC) is a new kind of machine learning method built on the theory of the traditional support vector machine (SVM). Whereas the traditional SVM seeks a single optimal hyperplane that separates the distinct classes under the margin-maximization criterion, NHC algorithms seek one approximating hyperplane per class, such that each hyperplane is as close as possible to its own class and as far as possible from the other class. Compared with the traditional SVM, NHC algorithms perform significantly better on XOR problems in the linear case. Because of this advantage, NHC classification has become an active research topic in machine learning. However, as relatively new methods in the field, existing NHC algorithms are still immature in many respects and require further study and improvement. This dissertation extends the existing research; the main results are as follows.

1. Locality preserving twin support vector machine. Many existing NHC algorithms do not fully exploit the local geometric structure and the underlying discriminant information of the data, both of which can be important for classification performance. To address this, a locality preserving twin support vector machine (LPTSVM) is proposed by introducing the basic theory of locality preserving projections (LPP) directly into the NHC framework. To reduce the computational time complexity, LPTSVM keeps only the boundary samples in the constraints of its optimization problems. For the possible singularity in LPTSVM, a dimensionality reduction method based on principal component analysis (PCA) is derived in theory.

2. Kernel-based least squares projection twin support vector machine (LSPTSVM) and its corresponding recursive learning method.
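The LPP machinery that LPTSVM (item 1) builds on starts from a neighbourhood graph with heat-kernel weights over the training samples. A minimal sketch of that construction; the k-nearest-neighbour rule and the parameters `k` and `t` are illustrative assumptions, not taken from the dissertation:

```python
import numpy as np

def heat_kernel_weights(X, k=5, t=1.0):
    """Build a k-nearest-neighbour graph with heat-kernel weights, as in
    locality preserving projections (LPP): W[i, j] = exp(-||x_i - x_j||^2 / t)
    if i and j are neighbours, else 0. X is an (n, d) sample matrix."""
    n = X.shape[0]
    # pairwise squared Euclidean distances via the expansion ||a-b||^2
    sq = np.sum(X ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(D2, np.inf)          # exclude self-loops
    W = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(D2[i])[:k]        # indices of the k nearest neighbours
        W[i, nn] = np.exp(-D2[i, nn] / t)
    return np.maximum(W, W.T)             # symmetrise the graph
```

The resulting matrix W (and its graph Laplacian) is what an LPP-style objective would plug into the twin-SVM optimization problems.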
LSPTSVM cannot effectively handle nonlinear classification. To address this, a kernel-based LSPTSVM (KLSPTSVM) is proposed: the training samples are mapped from the original space into a high-dimensional feature space by a kernel mapping, and the optimization problems of LSPTSVM are reconstructed accordingly. In addition, to further improve the classification capability of KLSPTSVM, the recursive learning method originally used to boost the performance of LSPTSVM is extended to the nonlinear case.

3. Robust weighted twin support vector machine with local information (WLTSVM). WLTSVM does not fully reflect the underlying similarity information between pairs of samples in the same class, has low training efficiency, and is sensitive to noisy samples. To address these problems, a robust WLTSVM (RWLTSVM) is proposed. In RWLTSVM, the weight matrix of the adjacency graph is defined by a heat kernel function rather than the simpler definition used in WLTSVM, which ensures that the underlying similarity information between any pair of same-class samples is fully reflected. To improve training efficiency, the solution of RWLTSVM reduces to solving just two systems of linear equations, as opposed to the two quadratic programming problems plus two systems of linear equations required by WLTSVM. In addition, RWLTSVM weights each sample of the contrary class when constructing the equality constraints, which makes it less sensitive to noisy samples than WLTSVM.

4. Weighted projection twin SVM (PTSVM) and its least squares version. PTSVM does not consider the underlying similarity information between pairs of samples in the same class. To address this, a weighted PTSVM (WPTSVM) is proposed.
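The "two systems of linear equations" route taken by RWLTSVM and the least squares variants can be illustrated on the plain least-squares twin-SVM formulation: each hyperplane comes from a regularized normal equation rather than a quadratic program. A sketch under simplifying assumptions (no locality weights; a small ridge term `eps` added for numerical stability; `c1`, `c2` are trade-off parameters):

```python
import numpy as np

def lstsvm_planes(A, B, c1=1.0, c2=1.0, eps=1e-6):
    """Least-squares twin-SVM style fit (a sketch): each class gets a
    hyperplane w.x + b = 0 that is close to its own class and pushed away
    from the other class in a least-squares sense. A, B hold the samples
    of the two classes row-wise; returns (w1, b1), (w2, b2)."""
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    H = np.hstack([A, e1])   # augmented own-class matrix for plane 1
    G = np.hstack([B, e2])   # augmented other-class matrix
    I = np.eye(H.shape[1])
    # plane 1: min ||H z||^2 + c1 ||G z + e2||^2  ->  one linear system
    z1 = -c1 * np.linalg.solve(H.T @ H + c1 * G.T @ G + eps * I, G.T @ e2)
    # plane 2: the symmetric problem with the class roles swapped
    z2 = c2 * np.linalg.solve(G.T @ G + c2 * H.T @ H + eps * I, H.T @ e1)
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def predict(x, planes):
    """Assign x to the class whose hyperplane is nearest (0 or 1)."""
    (w1, b1), (w2, b2) = planes
    d1 = abs(x @ w1 + b1) / np.linalg.norm(w1)
    d2 = abs(x @ w2 + b2) / np.linalg.norm(w2)
    return 0 if d1 <= d2 else 1
```

Each `np.linalg.solve` call is one of the "two systems of linear equations"; no quadratic programming solver is involved, which is the source of the training-efficiency gain the abstract refers to.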
Compared with PTSVM, WPTSVM achieves better classification capability because it mines as much underlying similarity information among the samples as possible by computing a relative density degree for each data point from the weights of the intra-class graph of the training set. In addition, WPTSVM weights each sample of the contrary class when constructing the inequality constraints, which makes it less sensitive to noisy samples. To reduce the computational time complexity, a least squares version of WPTSVM (LSWPTSVM) is also proposed, whose solution reduces to solving just two systems of linear equations instead of the two quadratic programming problems plus two systems of linear equations in WPTSVM.
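The relative density degree used by WPTSVM can be computed once the intra-class graph is available. The exact definition is not given in this abstract, so the normalisation below is an illustrative assumption: the degree is the total edge weight incident on a point, scaled into (0, 1]:

```python
import numpy as np

def relative_density_degrees(W):
    """Relative density degree of each sample from the weights W of the
    intra-class graph (n x n, symmetric, zero diagonal). Hypothetical
    definition: a point whose edges carry a large total weight lies in a
    dense region of its class and receives a degree close to 1."""
    d = W.sum(axis=1)              # total edge weight incident on each point
    return d / (d.max() + 1e-12)  # scale into (0, 1]
```

These degrees would then enter the WPTSVM objective as per-sample weights, for instance as a diagonal matrix multiplying the least-squares terms, so that points in dense regions contribute more than likely outliers.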
Keywords/Search Tags: nonparallel hyperplane classifier, locality preserving projection, kernel mapping, similarity information, least squares