
A Fast Optimization Algorithm For Support Vector Machines

Posted on:2017-03-02Degree:MasterType:Thesis
Country:ChinaCandidate:J Y KongFull Text:PDF
GTID:2278330485966151Subject:Computational Mathematics
Abstract/Summary:
The support vector machine (SVM) obtains the optimal separating hyperplane or regression function by solving a quadratic programming (QP) problem, and then uses it to predict the labels of test samples. SVMs are well suited to small-sample data processing and can learn an optimal decision function from relatively few samples. At present, SVMs are widely used for classification and recognition problems such as handwriting recognition and financial forecasting. In recent years, researchers have proposed the twin support vector machine (TWSVM) based on SVM. TWSVM obtains the decision function indirectly by solving two smaller QP problems, which makes its training speed approximately four times faster than that of SVM.

With the development of information technology, data are acquired ever more quickly; at the same time, rapid social development continually changes the available information. It is therefore necessary to study online learning for the various SVM models.

In this paper, we study online learning algorithms for the twin support vector machine (TWSVM), the twin parametric-margin support vector machine (TPMSVM), ε-twin support vector regression (ε-TWSVR), and twin parametric-insensitive support vector regression (TPISVR). Using the Lagrangian dual, each online update is transformed into a quadratic programming problem in a single variable (or a pair of variables). The computational cost of the proposed algorithms is comparable to, or slightly lower than, that of existing online methods, and much lower than that of SVM-based algorithms, so they exhibit very fast learning speed.
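For reference, the pair of smaller QP problems that TWSVM solves can be sketched as follows (this is the standard formulation from the TWSVM literature, not reproduced from this thesis). With A and B the matrices stacking the samples of the two classes, e1 and e2 vectors of ones, and c1 > 0 a penalty parameter, the first hyperplane (w1, b1) solves

```latex
\min_{w_1,\, b_1,\, \xi}\ \frac{1}{2}\left\| A w_1 + e_1 b_1 \right\|^2 + c_1\, e_2^{\top} \xi
\quad \text{s.t.}\quad -(B w_1 + e_2 b_1) + \xi \ge e_2,\ \ \xi \ge 0,
```

and the second hyperplane (w2, b2) solves the symmetric problem with the roles of A and B exchanged. Each problem involves only about half of the training samples, which is the source of the roughly fourfold speedup over a single large SVM QP.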
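The single-variable reduction can be illustrated with a small sketch. Because twin-type duals have only box constraints (no equality constraint coupling the multipliers), freezing the previously learned multipliers when a new sample arrives leaves a one-dimensional quadratic in the new multiplier, whose maximizer has a closed form and is then clipped to the box. The function below is a minimal illustration of this idea, not the thesis's algorithm; all names and the simplified dual objective are assumptions.

```python
import numpy as np

def online_dual_update(X_old, y_old, alpha_old, x_new, y_new, C=1.0, kernel=None):
    """Closed-form one-variable dual update (illustrative sketch).

    With the old multipliers frozen, the dual objective restricted to the
    new sample's multiplier alpha_n is a concave 1-D quadratic; its
    unconstrained maximizer is clipped onto the box constraint [0, C].
    """
    if kernel is None:
        kernel = lambda a, b: a @ b  # linear kernel as a simple default

    # Contribution of the frozen multipliers evaluated at the new point.
    s = sum(a * y * kernel(x, x_new) for a, y, x in zip(alpha_old, y_old, X_old))
    k_nn = kernel(x_new, x_new)

    # Stationary point of the 1-D quadratic, then projection onto [0, C].
    alpha_n = (1.0 - y_new * s) / k_nn
    return float(np.clip(alpha_n, 0.0, C))
```

Each arriving sample thus costs one pass over the stored support vectors plus a scalar clip, which is why such online updates are far cheaper than re-solving the full batch QP.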
Keywords/Search Tags:Pattern recognition, twin support vector machine, reproducing kernel Hilbert space, online learning