
Research on the Theory and Applications of Support Vector Regression Algorithms

Posted on: 2007-05-30
Degree: Doctor
Type: Dissertation
Country: China
Candidate: S H Ceng
Full Text: PDF
GTID: 1118360212968496
Subject: Control theory and control engineering
Abstract/Summary:
Support Vector Machine (SVM) was invented by Boser, Guyon and Vapnik, and was first presented at the 5th Annual ACM Workshop on Computational Learning Theory (COLT, 1992). It marked a new milestone in the field of intelligent computation after artificial neural networks. SVM rests on rigorously established statistical learning theory. It maps data from the sample space to a higher-dimensional feature space through kernel functions, converting a nonlinear problem into a linearly separable one in order to obtain the optimal solution. This was a great theoretical innovation. SVM has a rigorous mathematical foundation, and its training results depend only on the Support Vectors (SVs); it therefore generalizes strongly and has become an important tool for solving nonlinear problems. As a result, it has attracted broad attention in the field of intelligent computation and is widely used in pattern classification and regression.

Building on the achievements of earlier researchers and on the first phase of this study, this paper primarily investigates algorithms for Support Vector Regression (SVR) on large sample sets, as well as applications of SVR to rejecting exceptional data, failure testing, the nonlinear definition of a target variable that cannot be precisely defined in data mining, and vanadium extraction and desulphurization of hot metal. The aim of studying SVR algorithms is to construct SVR models rapidly; the strategy for developing them is to shrink the kernel matrix scale, and thereby reduce SVR training time, by searching for SVs progressively. The innovative ideas of this paper are as follows:

(1) Briefly demonstrated that SVR has an approximate hyperplane.
Based on the property that the training results of an SVM depend only on the SVs (in other words, they have no relationship with the non-support vectors), and on the observation that the SVs, which are distributed around the hyperplane, are inevitably distributed around the approximate hyperplane as well, an algorithm for building SVR is proposed: the Support Vector Stepwise Regression (SVSR) algorithm. The core idea of the algorithm is: map the training samples to the feature space via the kernel function k(x, x_i) (where x_i is an SV); compute the distance from each sample point to the approximate hyperplane in the feature space and sort these distances in ascending order; extract the m samples with the shortest distances and combine them with the k SVs of the previously trained approximate hyperplane to form a new training subset; train a new approximate hyperplane; and progressively search...
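The stepwise procedure described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the dissertation's implementation: it uses scikit-learn's SVR as the base trainer, uses the absolute residual |f(x) - y| as a stand-in for the distance to the approximate hyperplane, and all names (svsr_fit, subset_size, m, max_rounds) are hypothetical.

```python
# Hypothetical sketch of the SVSR idea: train on a small subset, keep the
# resulting support vectors, add the m remaining samples nearest to the
# current approximate hyperplane, and retrain until the subset stabilizes.
import numpy as np
from sklearn.svm import SVR

def svsr_fit(X, y, subset_size=50, m=20, max_rounds=10, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(subset_size, len(X)), replace=False)
    model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[idx], y[idx])
    for _ in range(max_rounds):
        sv_idx = idx[model.support_]               # k SVs of the current model
        rest = np.setdiff1d(np.arange(len(X)), sv_idx)
        if len(rest) == 0:
            break
        # Residual as a proxy for distance to the approximate hyperplane.
        dist = np.abs(model.predict(X[rest]) - y[rest])
        nearest = rest[np.argsort(dist)[:m]]       # m shortest distances
        new_idx = np.union1d(sv_idx, nearest)      # new training subset
        if np.array_equal(new_idx, np.sort(idx)):  # subset unchanged: stop
            break
        idx = new_idx
        model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[idx], y[idx])
    return model
```

Because each round trains only on the retained SVs plus m candidates, the kernel matrix stays small relative to the full sample set, which is the training-time saving the abstract describes.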
Keywords/Search Tags: Support Vector Regression, Combinatorial Optimization, Integer Programming, Exceptional Data, Failure Testing, Target Variable, Vanadium Extraction and Desulphurization of Hot Metal