
Study On Some Problems Of Support Vector Machine In The Primal Space

Posted on: 2010-01-05  Degree: Doctor  Type: Dissertation
Country: China  Candidate: Y Q Liu  Full Text: PDF
GTID: 1118360302991054  Subject: Applied Mathematics
Abstract/Summary:
Support vector machine (SVM) has been a dominant machine learning technique for more than a decade; however, most of its training algorithms were formulated for the dual problem in the dual space. Recent research indicates that solving the primal problem directly is also an effective way to train an SVM. As the study of SVMs in the primal space has deepened, various problems met in applications, such as semi-supervised learning, have been solved there. On the whole, however, research on SVMs in the primal space is still far from complete. This dissertation therefore focuses on the following four problems of SVM classification algorithms in the primal space.

To remedy the poor approximation performance of the smoothing functions available for smooth support vector machines, the plus function was transformed into an equivalent infinite series, from which a family of polynomial smoothing functions was derived, and their properties were discussed. It is shown that both the approximation accuracy and the order of smoothness of these polynomial functions can be made as high as required. Finally, the polynomial smoothing functions were used to solve the generalized support vector machine.

Semi-supervised SVM makes use of a large collection of unlabeled data jointly with a few labeled examples to improve generalization performance, which leads to a non-convex optimization problem. Two strategies were adopted for minimizing this problem: combinatorial optimization and continuous optimization. On the combinatorial side, a self-training semi-supervised SVM classification algorithm was presented, whose subroutine uses the polynomial smoothing functions obtained above to solve the standard SVM in the primal. On the continuous side, a polynomial smooth semi-supervised SVM classification algorithm was presented. The introduced polynomial functions are well grounded in theory and achieve high approximation accuracy in high-density regions of the samples, while their approximation is poorer in low-density regions.

The direct search method is a common unconstrained optimization technique. Unlike the cyclic algorithms previously used for SVMs, which update all components of w at once, a direct search method updates one component of w at a time by solving a one-variable subproblem. Because of the simplicity and practicality of direct search, three algorithms of this type were used to solve the linear SVM: the Hooke and Jeeves pattern search algorithm, Rosenbrock's rotating-coordinates method, and Powell's direction acceleration method. The detailed solving algorithms were given and their complexity was analyzed.

The essential reason for the sensitivity of SVM to noise is that the adopted linear (hinge) loss function places no bound on the penalty incurred by noisy samples. Based on the fact that a suitably designed loss function can control the loss value caused by noisy samples, a novel hyperbolic tangent loss function was constructed, and the corresponding robust SVM, the hyperbolic tangent SVM, was proposed.

Experiments were performed on all four problems of the SVM classification algorithm, and the results show that the proposed methods obtain satisfactory learning performance. Minimal illustrative sketches of the four techniques are given below.
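As a minimal sketch of the idea behind the first problem, the piecewise quadratic below is a standard polynomial smoothing of the plus function (x)_+ = max(x, 0); it is not the specific family derived in the dissertation, only an illustration. It is exact outside a band of half-width h, continuously differentiable everywhere, and its maximum error is h/4, so the approximation accuracy can be driven as high as required by shrinking h.

```python
import numpy as np

def plus(x):
    """The plus function (x)_+ = max(x, 0) used in primal SVM objectives."""
    return np.maximum(x, 0.0)

def smooth_plus(x, h):
    """Piecewise-quadratic polynomial smoothing of the plus function:
    exact outside [-h, h], quadratic inside, C^1 everywhere.
    Illustrative choice only, not the dissertation's polynomial family."""
    return np.where(x <= -h, 0.0,
           np.where(x >= h, x, (x + h) ** 2 / (4.0 * h)))

x = np.linspace(-2, 2, 100001)
for h in (1.0, 0.1, 0.01):
    err = np.max(np.abs(smooth_plus(x, h) - plus(x)))
    print(f"h = {h:5.2f}  max error = {err:.6f}  (theory: h/4 = {h/4:.6f})")
```

Shrinking h tightens the approximation uniformly, which mirrors the abstract's claim that accuracy can be made as high as required; higher-order polynomial pieces would likewise raise the order of smoothness.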
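For the second problem, here is a rough sketch of the self-training (combinatorial) strategy. It is a hedged illustration, not the dissertation's exact algorithm: scikit-learn's LinearSVC stands in for the polynomial-smoothed primal SVM solver used as the subroutine, and labels are assumed to be -1/+1.

```python
import numpy as np
from sklearn.svm import LinearSVC

def self_training_svm(X_lab, y_lab, X_unl, n_rounds=10, batch=10):
    """Self-training semi-supervised SVM: repeatedly fit a supervised SVM,
    then move the most confidently classified unlabeled points into the
    labeled set with their predicted labels. Illustrative sketch only;
    y_lab is assumed to contain -1/+1 labels."""
    X_lab, y_lab, X_unl = X_lab.copy(), y_lab.copy(), X_unl.copy()
    clf = LinearSVC(C=1.0)
    for _ in range(n_rounds):
        if len(X_unl) == 0:
            break
        clf.fit(X_lab, y_lab)
        scores = clf.decision_function(X_unl)        # signed margins
        idx = np.argsort(-np.abs(scores))[:batch]    # most confident points
        X_lab = np.vstack([X_lab, X_unl[idx]])
        y_lab = np.concatenate([y_lab, np.sign(scores[idx]).astype(int)])
        X_unl = np.delete(X_unl, idx, axis=0)
    return clf.fit(X_lab, y_lab)
```

The continuous-optimization route instead folds the unlabeled points into a single smooth, non-convex objective via the polynomial smoothing functions, rather than committing to hard labels round by round.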
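To make the third problem's coordinate-wise update concrete, the following is a bare-bones compass (coordinate) search on the primal linear SVM objective with squared hinge loss. It is a minimal stand-in for the Hooke and Jeeves, Rosenbrock, and Powell variants studied in the dissertation: each step perturbs a single component of w and keeps the perturbation only if the objective decreases, with no derivatives required.

```python
import numpy as np

def primal_objective(w, X, y, C=1.0):
    """f(w) = 0.5*||w||^2 + C * sum_i max(0, 1 - y_i * w.x_i)^2 (squared hinge)."""
    margins = 1.0 - y * (X @ w)
    return 0.5 * (w @ w) + C * np.sum(np.maximum(margins, 0.0) ** 2)

def coordinate_direct_search(X, y, C=1.0, step=1.0, shrink=0.5,
                             tol=1e-6, max_sweeps=200):
    """Derivative-free coordinate search: cycle through the components of w,
    trying w_j + step and w_j - step; halve the step when a full sweep yields
    no improvement. A minimal direct-search sketch, not the exact
    Hooke-Jeeves/Rosenbrock/Powell procedures."""
    w = np.zeros(X.shape[1])
    f = primal_objective(w, X, y, C)
    for _ in range(max_sweeps):
        improved = False
        for j in range(len(w)):
            for delta in (step, -step):
                w_try = w.copy()
                w_try[j] += delta
                f_try = primal_objective(w_try, X, y, C)
                if f_try < f:
                    w, f = w_try, f_try
                    improved = True
                    break
        if not improved:
            step *= shrink       # no coordinate move helped: refine the step
            if step < tol:
                break
    return w
```

The named methods differ mainly in how they build on such one-variable probes: Hooke and Jeeves adds pattern moves along the accumulated direction of progress, Rosenbrock rotates the search axes, and Powell accelerates along conjugate directions.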
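For the fourth problem, the abstract does not give the exact construction, so the following is only a hedged illustration of a bounded, tanh-based loss: applying tanh to the hinge loss of the margin m = y*(w.x) behaves like the hinge near the decision boundary but saturates at 1, so a noisy sample far on the wrong side contributes at most a bounded penalty.

```python
import numpy as np

def hinge_loss(m):
    """Standard (linear) hinge loss: grows without bound for badly
    misclassified points, which is what makes SVM sensitive to noise."""
    return np.maximum(0.0, 1.0 - m)

def tanh_hinge_loss(m):
    """A bounded hyperbolic-tangent loss, saturating at 1 for outliers.
    Illustrative form only; the dissertation's tanh loss may differ."""
    return np.tanh(np.maximum(0.0, 1.0 - m))

for m in (0.5, 0.0, -1.0, -10.0, -100.0):
    print(f"margin {m:7.1f}: hinge = {hinge_loss(m):8.2f}, "
          f"tanh loss = {tanh_hinge_loss(m):.4f}")
```

The printed comparison shows the hinge penalty growing linearly as the margin worsens while the tanh loss levels off, which is the bounding property the robust hyperbolic tangent SVM relies on.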
Keywords/Search Tags: Statistical learning theory, support vector machine, classification, smoothing function, semi-supervised learning, direct search method, Hooke and Jeeves algorithm, Rosenbrock algorithm, Powell algorithm