
Research On Models And Algorithms Of Twin Support Vector Machines

Posted on: 2017-04-27
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X J Xie
GTID: 1108330485469039
Subject: Computer application technology

Abstract/Summary:
Twin support vector machines (TSVM) are a class of nonparallel support vector machines that generate two nonparallel hyperplanes, each close to one class and at a certain distance from the other. Instead of one large quadratic programming problem, they solve a pair of relatively small quadratic programming problems, which makes them roughly four times faster than standard support vector machines (SVM), and their classification performance is often better. In recent years TSVM have developed rapidly and have been successfully applied to pattern recognition, data classification, function fitting and other fields. Multitask learning, multi-view learning and semi-supervised learning for SVM have also attracted many researchers. In this thesis, twin support vector machines are extended to the multitask, multi-view supervised, multi-view semi-supervised and semi-supervised learning frameworks, and PAC-Bayes theory is used to analyse their generalization performance.

In the multitask learning framework, we first present direct multitask twin support vector machines (DMTSVM), analogous to multitask support vector machines (MSVM): the classifiers of all tasks share a common underlying representation, while each task keeps its own bias. At the same time, to reduce sensitivity to outliers, we propose centroid twin support vector machines (CTSVM), which weight the distances from the class centroids to the hyperplanes; extending CTSVM to the multitask framework in the same way yields multitask centroid twin support vector machines (MCTSVM).

In the multi-view learning framework, we propose multi-view twin support vector machines (MvTSVM) for multi-view supervised learning and multi-view Laplacian twin support vector machines (MvLapTSVM) for multi-view semi-supervised learning. Both methods combine two views by introducing a multi-view constraint, as in SVM-2K.
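The pair of small quadratic programs described above can be made concrete. Below is a minimal NumPy sketch of a standard linear TSVM, assuming the usual dual formulation (with matrices H = [A e], G = [B e] and a small regularizer for invertibility); the box-constrained QP solver `solve_box_qp` is an illustrative projected-gradient routine written for this sketch, not part of any library, and a production implementation would use a proper QP solver.

```python
import numpy as np

def solve_box_qp(Q, e, c, iters=2000):
    """Projected gradient ascent for: max e^T a - 0.5 a^T Q a, 0 <= a <= c."""
    a = np.zeros(len(e))
    eta = 1.0 / (np.linalg.norm(Q, 2) + 1e-12)  # step = 1 / Lipschitz constant
    for _ in range(iters):
        a = np.clip(a + eta * (e - Q @ a), 0.0, c)
    return a

def train_tsvm(A, B, c1=1.0, c2=1.0, eps=1e-4):
    """Linear TSVM: one hyperplane per class, each close to its own class
    and at least unit (soft-margin) distance from the other class."""
    H = np.hstack([A, np.ones((A.shape[0], 1))])  # augmented class-A matrix [A e]
    G = np.hstack([B, np.ones((B.shape[0], 1))])  # augmented class-B matrix [B e]
    HtH_inv = np.linalg.inv(H.T @ H + eps * np.eye(H.shape[1]))
    GtG_inv = np.linalg.inv(G.T @ G + eps * np.eye(G.shape[1]))
    # Dual of plane 1 (close to A, far from B): one variable per row of B.
    alpha = solve_box_qp(G @ HtH_inv @ G.T, np.ones(G.shape[0]), c1)
    u1 = -HtH_inv @ G.T @ alpha                   # [w1; b1]
    # Dual of plane 2 (close to B, far from A): one variable per row of A.
    gamma = solve_box_qp(H @ GtG_inv @ H.T, np.ones(H.shape[0]), c2)
    u2 = GtG_inv @ H.T @ gamma                    # [w2; b2]
    return (u1[:-1], u1[-1]), (u2[:-1], u2[-1])

def predict(X, plane1, plane2):
    """Assign each point to the class whose hyperplane is nearer."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)              # 1 = class A, -1 = class B
```

Note that each dual involves only the opposite class's constraints, which is why the two problems are smaller than the single QP of a standard SVM.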
MvLapTSVM builds on MvTSVM, additionally incorporating a squared loss and Laplacian regularization in the spirit of Laplacian twin support vector machines (LapTSVM).

In the semi-supervised learning framework, we use a new regularizer called tangent space intrinsic manifold regularization (TSIMR), which not only captures the local manifold structure of labeled and unlabeled examples but also includes the classical Laplacian regularization term. Combining it with twin support vector machines for semi-supervised learning yields tangent space intrinsic manifold regularization twin support vector machines (TiTSVM).

One important reason SVM are widely used is that they are supported by a strong statistical learning theory; by contrast, relatively little is known about the theoretical analysis of twin support vector machines. Among the tightest bounds for practical applications, the PAC-Bayes bound and the prior PAC-Bayes bound are based on a prior and a posterior distribution over classifiers. Finally, we apply PAC-Bayes theory to analyse the generalization performance of twin support vector machines.

To evaluate the proposed methods, we conduct comparative experiments on a number of real-world data sets. The experimental results verify the effectiveness of the proposed methods.
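For reference, the classical PAC-Bayes bound mentioned above (in its Langford–Seeger form, which is one standard statement; the thesis's own bounds for TSVM may differ in detail) reads:

```latex
% With probability at least 1 - \delta over an i.i.d. sample S of size m,
% simultaneously for all posteriors Q over classifiers:
\[
  \mathrm{kl}\!\left(\hat{e}_S(Q) \,\middle\|\, e_D(Q)\right)
  \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m},
\]
% where \hat{e}_S(Q) is the empirical Gibbs risk, e_D(Q) the true Gibbs risk,
% P the prior fixed before seeing S, and kl the binary relative entropy:
\[
  \mathrm{kl}(q \,\|\, p) \;=\; q \ln\frac{q}{p} + (1-q)\ln\frac{1-q}{1-p}.
\]
```

The prior PAC-Bayes bound tightens this by learning the prior P on a held-out part of the data, which shrinks the KL term.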
Keywords/Search Tags:Support vector machines, Twin support vector machines, Multitask learning, Multi-view learning, Semi-supervised learning, PAC-Bayes bound