
Research On Support Vector Regression Algorithms And Its Application

Posted on: 2013-02-11
Degree: Doctor
Type: Dissertation
Country: China
Candidate: F D Zheng
Full Text: PDF
GTID: 1118330362468666
Subject: Computer application technology
Abstract/Summary:
In the past few decades, many pattern recognition and machine learning algorithms have been troubled by overfitting, local minima, and excessively large training sets. Based on statistical learning theory, the support vector machine (SVM) partly overcomes these problems, and SVM was successfully applied to regression through the introduction of the ε-insensitive loss function. However, much SVM research addresses classification and cannot be used directly to solve regression problems. This dissertation focuses on support vector regression (SVR); its main contents are as follows:

1. Improved twin support vector regression algorithms.

Twin SVR (TSVR) converts the classical quadratic programming problem (QPP) with inequality constraints into two small QPPs with equality constraints. This dissertation proposes three modified algorithms based on TSVR. ① We add a regularization term to the QPPs of TSVR and thereby implement the structural risk minimization principle; by tuning the regularization parameter, the regression function can fit the data better. The SOR algorithm is used to solve the dual problem of the regularized TSVR, and a gradient algorithm can solve its QPPs directly. ② We convert the equality constraints of the TSVR QPPs into inequality constraints using a technique from the proximal support vector machine and add a unit-norm constraint; the resulting dual QPPs are solved with a penalty-function approach. ③ We change the penalty term on the slack variables to a quadratic penalty, so that the small QPPs of TSVR can be solved by an iterative algorithm. The iteration converges from any starting point and needs no quadratic optimization package, which makes the algorithm very fast. Experiments on artificial and benchmark datasets show that the proposed methods are competitive with previously published ones.

2. Incremental support vector regression.

Incremental learning algorithms are needed when the training set is too large or samples arrive gradually. This dissertation proposes two incremental regression algorithms based on Lagrangian support vector regression (LSVR), covering both the online and the batch incremental setting. LSVR minimizes an unconstrained differentiable convex program and is solved by an iterative algorithm with simple linear convergence; the iteration converges from any starting point and needs no quadratic optimization package. LSVR has the advantage that its solution is obtained by inverting, once at the start of the iteration, a matrix whose order equals the number of input samples. The proposed algorithms build on previously computed information, so the computation need not be repeated from scratch (see the first sketch below). Their effectiveness is illustrated on several UCI datasets, where they are competitive with previously published methods.

3. Finite Newton method for Lagrangian support vector regression.

Lagrangian SVR is effective, but its linear-rate iteration can need many steps to converge from a starting point. We apply a finite Armijo-Newton algorithm to the LSVR optimization problem: the solution is obtained by solving a system of linear equations a finite number of times rather than by solving a quadratic optimization problem (see the second sketch below). The resulting method enjoys global convergence and finite termination. Experimental results on artificial synthetic datasets and benchmark datasets indicate that the proposed NLSVR is fast and shows good generalization performance.
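The abstract does not spell out the update formulas for point 2, but the key computational object there is the inverse of a matrix whose order grows as samples arrive. A standard device for reusing a previously computed inverse in that situation is the bordered-inverse (Schur complement) update. The Python sketch below is a minimal illustration under the assumption of a symmetric positive definite matrix; it is not the thesis's exact online LSVR update.

    import numpy as np

    def grow_inverse(A_inv, b, c):
        """Inverse of the bordered matrix [[A, b], [b^T, c]] given A^{-1}.

        Assumes A is symmetric positive definite, so absorbing one new
        sample costs O(n^2) instead of the O(n^3) of a fresh inversion.
        """
        Ab = A_inv @ b                     # A^{-1} b
        k = c - b @ Ab                     # scalar Schur complement
        n = A_inv.shape[0]
        M_inv = np.empty((n + 1, n + 1))
        M_inv[:n, :n] = A_inv + np.outer(Ab, Ab) / k
        M_inv[:n, n] = -Ab / k
        M_inv[n, :n] = -Ab / k
        M_inv[n, n] = 1.0 / k
        return M_inv

With an update of this kind, an online learner that has already inverted the n-sample matrix can absorb sample n+1 cheaply, which is the kind of reuse of previously computed information the abstract describes.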
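Point 3 names a finite Armijo-Newton algorithm. The thesis's exact NLSVR update is not reproduced here; the sketch below only illustrates the generic pattern of a Newton step safeguarded by Armijo backtracking on a smooth convex objective, where f, grad, and hess are problem-specific callables standing in for the LSVR derivatives.

    import numpy as np

    def armijo_newton(f, grad, hess, x, tol=1e-8, beta=0.5, sigma=1e-4, max_iter=50):
        """Newton's method with an Armijo backtracking line search."""
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            d = np.linalg.solve(hess(x), -g)          # Newton direction: one linear solve
            t = 1.0
            while f(x + t * d) > f(x) + sigma * t * (g @ d):
                t *= beta                              # Armijo sufficient-decrease backtracking
            x = x + t * d
        return x

    # Example: minimize 0.5 x^T Q x - p^T x; a full Newton step solves it at once
    Q = np.array([[3.0, 1.0], [1.0, 2.0]])
    p = np.array([1.0, 1.0])
    x_star = armijo_newton(lambda x: 0.5 * x @ Q @ x - p @ x,
                           lambda x: Q @ x - p,
                           lambda x: Q,
                           x=np.zeros(2))

Each iteration reduces to solving a system of linear equations, which matches the abstract's description of how the quadratic optimization problem is avoided.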
4. Primal weighted support vector regression.

We propose a robust weighted regression algorithm solved in the primal space by an iterative Newton algorithm. The algorithm suppresses outliers through weighting: the further a sample deviates from the model, the smaller the weight of its loss term, and the less it affects the estimation of the model parameters (see the sketch below). Experiments on artificial and benchmark datasets, and on stock-price prediction, show that the proposed method is competitive with previously published methods.
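The abstract states the weighting principle for point 4 but not the exact weight function. As a minimal, hedged illustration of "larger deviation, smaller weight", the sketch below uses iteratively reweighted ridge regression with Huber-style weights and a MAD scale estimate; the weight function, the constant 1.345, and the ridge solver are illustrative assumptions, not the thesis's primal Newton algorithm.

    import numpy as np

    def robust_weighted_fit(X, y, lam=1.0, c=1.345, n_iter=10):
        """Iteratively reweighted ridge regression: samples with large
        residuals are down-weighted, so outliers barely influence the fit."""
        n, d = X.shape
        weights = np.ones(n)
        for _ in range(n_iter):
            # Weighted regularized least squares in closed form
            WX = X * weights[:, None]
            w = np.linalg.solve(X.T @ WX + lam * np.eye(d), WX.T @ y)
            r = y - X @ w
            scale = np.median(np.abs(r)) / 0.6745 + 1e-12        # robust MAD scale
            # Huber-style weights: 1 inside c*scale, decaying as 1/|r| outside
            weights = np.minimum(1.0, c * scale / (np.abs(r) + 1e-12))
        return w

The loop alternates a weighted fit with a weight refresh, so a gross outlier's residual stays large and its weight stays near zero, matching the behavior the abstract describes.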
Keywords/Search Tags: pattern recognition, machine learning, support vector regression, twin support vector regression, incremental algorithm, weighted support vector machine, robust algorithm