
Study On Algorithms For Training Support Vector Classification

Posted on: 2012-12-25  Degree: Doctor  Type: Dissertation
Country: China  Candidate: A G Lu  Full Text: PDF
GTID: 1228330395957207  Subject: Applied Mathematics
Abstract/Summary:
Support vector machine (SVM) is an excellent machine learning method based on statistical learning theory. Built on the structural risk minimization principle, it derives the optimal linear decision function in a high-dimensional feature space via the kernel trick, avoids the curse of dimensionality, and has good generalization ability. Owing to this performance, it has been widely applied to pattern recognition, function approximation, and density estimation, and has become a hot topic in machine learning. Support vector machines for the classification problem are called support vector classification (SVC). In order to improve training speed and reduce the complexity of support vector machines, this dissertation concentrates on algorithms for training support vector classification. The main contributions are as follows:

1. Firstly, a theorem and its detailed proof are given. Secondly, based on the advantages and disadvantages of SMO, an improved SVC learning algorithm based on a three-variable working set is presented (a simplified two-variable SMO update is sketched after this abstract for reference). The method solves a series of quadratic programming (QP) sub-problems over three points, and the proposed theorem guarantees that the corresponding relaxed sub-problems can be solved analytically, so the method approaches the optimal solution more quickly. Finally, numerical experiments show that the new method has superior performance in saving computational cost and improving classification accuracy.

2. Based on the idea that multi-variable coordinated optimization can reduce the number of iterations and the training time, a quadruple sequential analytic optimization method for training support vector classification is presented. The method solves a series of QP sub-problems over four points; these sub-problems are solved analytically, so the method approaches the optimal solution more quickly. Moreover, a convergence analysis of the method is given. Numerical results demonstrate that the proposed method is faster and has lower complexity than other algorithms.

3. Based on the KKT conditions of the primal-dual problems of support vector classification, a reduced second-order Mehrotra predictor-corrector algorithm for training support vector classification is presented. Starting with a large portion of the samples, the method uses a reduction technique to exclude more and more unnecessary samples as the iterations proceed. Numerical results show that the new method is efficient.

4. Firstly, for the primal-dual problems of support vector classification, and in view of the disadvantages of interior point methods, an inexact infeasible interior point algorithm for training support vector classification is presented, together with a global convergence analysis. Secondly, an improved FR (Fletcher-Reeves) nonlinear conjugate gradient method is proposed for solving the resulting systems of linear equations, and it is proved to be globally convergent under both the Wolfe line search and an Armijo-type line search (a textbook FR sketch with an Armijo line search also follows the abstract). Finally, numerical experiments show that the proposed method performs well.
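For context on contributions 1 and 2, which extend SMO-style decomposition to three- and four-variable working sets, the following is a minimal sketch of the standard two-variable SMO update for the dual soft-margin SVC problem with a linear kernel. It is not the dissertation's algorithm: the random choice of the second working-set index, the stopping tolerance, and the toy data at the end are assumptions made only for illustration.

```python
import numpy as np

def smo_simplified(X, y, C=1.0, tol=1e-3, max_passes=20, seed=0):
    """Simplified two-variable SMO for the dual soft-margin SVC problem
    with a linear kernel.  The second working-set index is chosen at
    random (not by the usual maximal-violation heuristic) to keep the
    sketch short."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = X @ X.T                      # linear-kernel Gram matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            E_i = (alpha * y) @ K[:, i] + b - y[i]      # prediction error on sample i
            # proceed only if alpha_i violates the KKT conditions
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                j = j + 1 if j >= i else j               # second index j != i
                E_j = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # box constraints for alpha_j
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2.0 * K[i, j] - K[i, i] - K[j, j]
                if eta >= 0:
                    continue
                # analytic solution of the two-variable sub-problem, clipped to [L, H]
                alpha[j] = np.clip(aj_old - y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # update the bias term
                b1 = b - E_i - y[i] * (alpha[i] - ai_old) * K[i, i] - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - ai_old) * K[i, j] - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = 0.5 * (b1 + b2)
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X              # primal weight vector (linear kernel only)
    return w, b

if __name__ == "__main__":
    # toy linearly separable data
    X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
    y = np.array([1.0, 1.0, -1.0, -1.0])
    w, b = smo_simplified(X, y)
    print("predictions:", np.sign(X @ w + b))
```

Each iteration fixes all multipliers except the chosen pair and solves that two-variable QP analytically under the box constraints; the three- and four-variable methods of contributions 1 and 2 generalize exactly this analytic sub-problem step to larger working sets.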
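Contribution 4 relies on an improved FR nonlinear conjugate gradient method for the linear systems arising inside the interior point iterations. As a point of reference only, the sketch below is the textbook Fletcher-Reeves scheme with an Armijo backtracking line search, applied to a strictly convex quadratic so that its minimizer solves A x = b; the steepest-descent restart safeguard, the step-size constants, and the random test matrix are assumptions for this example, not the dissertation's improved variant.

```python
import numpy as np

def fr_cg_quadratic(A, b, x0=None, max_iter=500, tol=1e-8,
                    armijo_c=1e-4, shrink=0.5):
    """Fletcher-Reeves (FR) nonlinear conjugate gradient with an Armijo
    backtracking line search, applied to f(x) = 0.5 x^T A x - b^T x with
    A symmetric positive definite, so the minimizer solves A x = b."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)

    def f(z):
        """Objective value."""
        return 0.5 * z @ (A @ z) - b @ z

    def grad(z):
        """Gradient of f, i.e. the residual A z - b."""
        return A @ z - b

    g = grad(x)
    d = -g                                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                            # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking: shrink t until the sufficient-decrease condition holds
        t, gd = 1.0, g @ d
        while f(x + t * d) > f(x) + armijo_c * t * gd:
            t *= shrink
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)          # Fletcher-Reeves coefficient
        d = -g_new + beta * d                     # new search direction
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T + 5 * np.eye(5)                   # symmetric positive definite test matrix
    b = rng.standard_normal(5)
    x = fr_cg_quadratic(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))
```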
Keywords/Search Tags: statistical learning theory, support vector classification, kernel function, quadratic programming, working set, three-variable SVM learning algorithm, quadruple sequential analytic optimization algorithm, interior point method, convergence analysis