
Nonparallel ε-band Support Vector Regression And Its Algorithm

Posted on: 2014-01-04    Degree: Master    Type: Thesis
Country: China    Candidate: X Y Gao    Full Text: PDF
GTID: 2248330398467956    Subject: Computational Mathematics
Abstract/Summary:
The support vector machine (SVM), based on statistical learning theory and the principle of structural risk minimization, is a relatively new machine learning method and a powerful tool for data mining. It handles practical difficulties such as nonlinearity, small samples, high dimensionality, and local minima well, and achieves good generalization ability. SVM was first developed for classification (pattern recognition) problems; drawing on the effective algorithms for classification, the method was later extended to regression problems. Support vector regression (SVR) has important theoretical significance and application prospects for function approximation. When SVR is used to solve regression problems, the questions that have always attracted researchers' attention are how to reduce the computational complexity and how to improve the regression accuracy. These questions are also the focus of this thesis.

The main contents of this thesis can be summarized as follows:

1. Nonparallel ε-band support vector regression (NSVR). Based on the idea of twin support vector regression (TSVR), we propose a new regression algorithm, which we call nonparallel ε-band support vector regression. Its aim is likewise to find two nonparallel hyperplanes, each of which determines an ε-up hyperplane and an ε-down hyperplane. Compared with twin support vector regression, the geometric meaning of nonparallel ε-band support vector regression is closer to that of standard support vector regression, and the computational complexity is reduced. The experimental results also indicate that nonparallel ε-band support vector regression not only has good generalization performance but is also effective.

2. Primal nonparallel ε-band support vector regression (PNSVR). The idea of the new algorithm is to introduce a loss function into the optimization problem of twin support vector regression and rewrite it as an unconstrained optimization problem. A quadratic function is then used to approximate the loss, so the primal optimization problem can be solved directly through a set of linear equations (an illustrative sketch of this kind of primal solve is given below). The algorithm not only reduces computation time but also does not lose accuracy. Numerical experiments also verify the superiority of the proposed algorithm.
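The abstract describes the primal solution strategy only at a high level. The following is a minimal sketch, not the thesis' exact NSVR/PNSVR formulation: it fits a TSVR-style pair of nonparallel ε-bound functions in the primal, replacing the hinge-type penalty with a squared (quadratic) approximation so that each Newton-style step reduces to solving a linear system. The function names, the squared-hinge choice, and all parameter values are assumptions made for illustration.

```python
# Illustrative sketch of a primal, linear-equation solve for a TSVR-style
# pair of nonparallel epsilon-bound regressors (assumed formulation, not the
# thesis' exact PNSVR model).
import numpy as np

def fit_bound(X, y, eps=0.1, C=10.0, lower=True, iters=50, ridge=1e-8):
    """Fit one bound function f(x) = w.x + b in the primal.

    lower=True  : down-bound, targets y - eps and penalizes f(x) > y - eps
    lower=False : up-bound,   targets y + eps and penalizes f(x) < y + eps
    The hinge penalty is replaced by a squared (quadratic) penalty, so each
    iteration only requires solving a small linear system.
    """
    d = y - eps if lower else y + eps
    A = np.hstack([X, np.ones((X.shape[0], 1))])     # augment with bias column
    I = ridge * np.eye(A.shape[1])
    u = np.linalg.solve(A.T @ A + I, A.T @ d)        # plain least-squares start
    for _ in range(iters):
        r = A @ u - d
        s = (r > 0) if lower else (r < 0)            # currently violated points
        s = s.astype(float)
        H = A.T @ A + C * (A.T * s) @ A + I          # A^T (I + C S) A
        g = A.T @ ((1.0 + C * s) * d)                # A^T (I + C S) d
        u_new = np.linalg.solve(H, g)                # one linear system per step
        if np.allclose(u_new, u, atol=1e-10):
            break
        u = u_new
    return u                                         # stacked (w, b)

def predict(u, X):
    return X @ u[:-1] + u[-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
    u1 = fit_bound(X, y, lower=True)                 # epsilon-down bound
    u2 = fit_bound(X, y, lower=False)                # epsilon-up bound
    y_hat = 0.5 * (predict(u1, X) + predict(u2, X))  # final regressor
    print("train RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```

As in TSVR, the final estimator is taken as the average of the two nonparallel bound functions; the quadratic penalty is what allows each bound to be obtained from a handful of linear solves instead of a quadratic program.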
Keywords/Search Tags: Regression problem, Support vector regression, Twin support vector regression, Primal twin support vector regression