
Research On The Key Problems Of Twin Support Vector Machines

Posted on: 2015-02-28    Degree: Doctor    Type: Dissertation
Country: China    Candidate: H J Huang    Full Text: PDF
GTID: 1268330422487419    Subject: Computer application technology
Abstract/Summary:
The Twin Support Vector Machine (TWSVM) is a new machine learning method based on the theory of the Support Vector Machine (SVM). Unlike SVM, TWSVM generates two nonparallel hyperplanes for classification problems; for regression problems, it generates two nonparallel functions such that each function determines the ε-insensitive down- or up-bound of the unknown regressor. The formulation of TWSVM is very similar to that of the classical SVM, but its learning speed is approximately four times faster. TWSVM has become a popular method because of its excellent learning performance; however, as a relatively new theory in the field of machine learning, it is not yet mature and needs further study and improvement, and its learning algorithms are among the main difficulties and emphases. This dissertation studies TWSVM with the aims of improving its generalization ability, speeding up its learning, and enhancing its robustness. The research results are as follows.

1. Study on smooth TWSVM. To address the low approximation ability of the sigmoid function used in the smooth twin support vector machine (STWSVM), the CHKS function, which has better approximation ability, is adopted as the smoothing function, and a new model, the smooth CHKS twin support vector machine, is proposed (the standard TWSVM problems and the CHKS function are sketched below). Furthermore, like TWSVM, STWSVM does not consider how the positions of the samples affect its performance. To address this, we design a membership function that assigns each training sample a different importance according to its position, which yields the weighted smooth CHKS twin support vector machine. The convergence of the algorithm is proved. Finally, the proposed algorithm is extended to regression problems: using a discrete PSO algorithm for parameter optimization and feature selection, a new smooth twin support vector regression, termed smooth CHKS twin support vector regression based on discrete PSO, is proposed, and its arbitrary-order smoothness and convergence are proved mathematically.
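For orientation, the two primal problems of the standard linear TWSVM (the formulation of Jayadeva et al., on which the models above build) and the CHKS smoothing function can be written as follows. Here A and B collect the samples of the two classes, e_1 and e_2 are vectors of ones, and c_1, c_2 > 0 are penalty parameters; the exact weighted and smoothed objectives used in the dissertation are not reproduced here.

    \min_{w_1, b_1, \xi}\ \tfrac{1}{2}\|A w_1 + e_1 b_1\|^2 + c_1 e_2^{\top}\xi
    \quad \text{s.t.}\quad -(B w_1 + e_2 b_1) + \xi \ge e_2,\ \xi \ge 0,

    \min_{w_2, b_2, \eta}\ \tfrac{1}{2}\|B w_2 + e_2 b_2\|^2 + c_2 e_1^{\top}\eta
    \quad \text{s.t.}\quad (A w_2 + e_1 b_2) + \eta \ge e_1,\ \eta \ge 0.

Smoothing methods fold the constraints into the objective through the plus function x_+ = \max(x, 0) and then replace it by a differentiable approximation. STWSVM typically uses the sigmoid-based approximation x + \tfrac{1}{\alpha}\ln(1 + e^{-\alpha x}), whereas the CHKS function is

    \phi(x, \mu) = \frac{x + \sqrt{x^{2} + 4\mu^{2}}}{2}, \qquad \phi(x, \mu) \to \max(x, 0)\ \text{as}\ \mu \to 0^{+},

which is infinitely differentiable for \mu > 0, so Newton-type methods can be applied to the smoothed problems.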
2. Study on an unconstrained non-differentiable solving method for TWSVM. Based on the KKT complementarity conditions, an unconstrained non-differentiable optimization model for TWSVM is proposed, and an adaptive adjustable entropy function method is given to train it. The method can find an optimal solution with relatively small parameter values, which avoids the numerical overflow of the traditional entropy function method. The adaptive adjustable entropy function method is then applied analogously to twin support vector regression (TSVR), giving an unconstrained non-differentiable optimization model for TSVR based on the adaptive adjustable entropy function.

3. Study on least squares TSVR and its feature selection method. To improve the computational efficiency of TSVR, we propose a novel least squares twin support vector regression, called LSTSVR for short. LSTSVR solves two modified primal problems of TSVR instead of the two dual problems usually solved; their solution reduces to just two systems of linear equations, as opposed to the two quadratic programming problems plus two systems of linear equations required by TSVR, which leads to an extremely simple and fast algorithm. Theoretical analysis shows that in the linear case the computational complexity of LSTSVR is related only to the dimension of the samples, so LSTSVR provides an effective method for large-sample problems. To further speed up LSTSVR on high-dimensional problems, we propose a feature selection method for it. First, all the 2-norm terms in LSTSVR are replaced with 1-norm ones, so the formulation can be converted into a linear programming (LP) problem. Second, by minimizing an exterior penalty problem of the dual of the LP formulation with a fast generalized Newton algorithm, the method yields very sparse solutions, so the resulting regressor depends on only a small number of input features; in the linear case, the input features are selected automatically.

4. Study on least squares twin parametric insensitive support vector regression (LSTPISVR). We formulate a least squares version of twin parametric insensitive support vector regression (TPISVR). First, by introducing the least squares method, the two quadratic programming problems of TPISVR are converted into two systems of linear equations (a sketch of this reduction is given below). Then the computational complexity of the algorithm is analyzed. Further, a chaotic cuckoo optimization algorithm is proposed and used for parameter selection.
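To make the computational point of items 3 and 4 concrete, the following is a minimal NumPy sketch of a least-squares twin regression fit in the linear case. It is not the dissertation's exact LSTSVR or LSTPISVR objective: the shift parameters eps1, eps2 and the Tikhonov terms c1*I, c2*I are illustrative assumptions used to keep the systems well posed. What it does show is the reduction of each subproblem to one linear system of size (n + 1) x (n + 1).

import numpy as np

def ls_twin_regression_fit(A, Y, eps1=0.1, eps2=0.1, c1=1.0, c2=1.0):
    """Fit down- and up-bound linear regressors via two linear solves.

    A : (m, n) training inputs; Y : (m,) training targets.
    Returns u1, u2, each of shape (n + 1,): weights followed by the bias.
    """
    m, n = A.shape
    G = np.hstack([A, np.ones((m, 1))])            # augmented data matrix [A, e]

    # Down-bound regressor: regularized least-squares fit to the shifted targets Y - eps1.
    u1 = np.linalg.solve(G.T @ G + c1 * np.eye(n + 1), G.T @ (Y - eps1))

    # Up-bound regressor: symmetric problem with the targets shifted to Y + eps2.
    u2 = np.linalg.solve(G.T @ G + c2 * np.eye(n + 1), G.T @ (Y + eps2))
    return u1, u2

def ls_twin_regression_predict(X, u1, u2):
    # As in TSVR, the final estimate is the mean of the two bounding functions.
    G = np.hstack([X, np.ones((X.shape[0], 1))])
    return 0.5 * (G @ u1 + G @ u2)

Each call to np.linalg.solve factorizes only an (n + 1) x (n + 1) matrix, so no quadratic program over the m training samples is ever formed; this is the sense in which the linear-case complexity depends only on the sample dimension, with the dissertation's actual objectives replacing the simple ridge terms above.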
Keywords/Search Tags: Twin support vector machines, Twin support vector regression, Twin parametric insensitive support vector regression, Chaotic cuckoo optimization