
A Study Of Support Vector Classifiers Based On The Theory Of Minimum Enclosing Ball

Posted on: 2011-06-25  Degree: Master  Type: Thesis
Country: China  Candidate: Y Xiao  Full Text: PDF
GTID: 2178360332956100  Subject: Systems analysis and integration
Abstract/Summary:
Support Vector Machine (SVM) is a machine learning algorithm introduced in the 1990s. Owing to its excellent learning performance, it has been applied successfully in many fields, such as handwritten digit recognition, automatic text categorization, and face detection. To solve a complicated classification task, it maps vectors from the input space into a feature space in which a linear separating hyperplane is constructed. As an implementation of structural risk minimization, SVM has the advantages of global optimality, simple structure, and high practicality. When dealing with large-scale data sets, however, the method suffers from long computation times, high memory consumption, and degraded accuracy. How to improve the training efficiency and generalization ability of SVM on large-scale learning problems is therefore an academic topic of great practical significance.

Aimed at large-scale classification problems, this thesis presents a fast training algorithm for support vector machines based on the theory of the Minimum Enclosing Ball (MEB). Its main research contents are as follows:

(1) We first introduce the basic ideas of statistical learning theory. Building on the fundamental concepts of SVM theory and its training algorithms, we study several common SVM training algorithms and compare their strengths and weaknesses, with particular attention to Platt's SMO (Sequential Minimal Optimization) algorithm, including its training-point selection, optimization step, and threshold update.

(2) SMO is efficient on large-scale training sets, but it still has shortcomings, including slow training speed and large memory requirements. For the selection of the two sample points, this thesis therefore introduces a ranking method into the inner loop of SMO; we call the improved algorithm Q-SMO. The candidates are sorted by |E1 - E2| and the first five are tried in turn, testing whether the objective function actually decreases; this improves the convergence of the algorithm. Finally, we further enhance training speed by adding a caching mechanism.

(3) Starting from the Core Vector Machine (CVM) algorithm proposed by Ivor W. Tsang, which is built on the MEB problem, this thesis gives a detailed derivation of why the two-class SVM can be viewed as an MEB problem, and then introduces the initialization of the algorithm and the probabilistic speedup method. Finally, it combines CVM with Q-SMO to solve the optimization problem over the core set, yielding an improved algorithm, QS-CVM, for handling large-scale data sets.

Finally, the improved algorithm and existing algorithms are tested and analyzed on the Adult and Extended USPS data sets; the results show that QS-CVM significantly improves the speed of SVM training.
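The SVM-to-MEB equivalence used in (3) can be summarized briefly. This is a sketch in the spirit of Tsang et al.'s CVM formulation, not the thesis's own derivation. The dual of the MEB problem over feature-space points $\varphi(x_1),\dots,\varphi(x_n)$ is

\[
\max_{\boldsymbol{\alpha}} \; \sum_i \alpha_i\, k(x_i, x_i) \;-\; \sum_{i,j} \alpha_i \alpha_j\, k(x_i, x_j)
\quad \text{s.t.} \quad \alpha_i \ge 0,\;\; \sum_i \alpha_i = 1 .
\]

A two-class L2-SVM dual can be brought into the same simplex-constrained form with a modified kernel that absorbs the labels and the regularization term, for example

\[
\tilde{k}(z_i, z_j) \;=\; y_i y_j\, k(x_i, x_j) \;+\; y_i y_j \;+\; \frac{\delta_{ij}}{C} .
\]

When $\tilde{k}(z, z)$ is a constant $\kappa$ (as it is for many kernels, e.g. the Gaussian), the linear term $\sum_i \alpha_i \tilde{k}(z_i, z_i) = \kappa$ is constant, so the two optimization problems coincide and the SVM can be trained by solving an MEB over the core set.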
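The ranking-based pair selection described for Q-SMO can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's implementation: `E` holds the prediction errors E_i, `try_step` is a hypothetical callback that attempts the joint update of a candidate pair and reports the resulting decrease in the objective, and the candidate list, the ranking by |E1 - E2|, and the top-5 cutoff mirror the description in the abstract.

```python
import numpy as np

def select_second_point(i, E, candidates, try_step, top_k=5):
    """Rank candidate indices j by |E[i] - E[j]| (largest first) and
    return the first one for which the trial step actually decreases
    the objective function; return None if none of the top_k do.

    try_step(i, j) is a hypothetical callback: it attempts the joint
    update of (alpha_i, alpha_j) and returns the objective decrease.
    """
    ranked = sorted(candidates, key=lambda j: abs(E[i] - E[j]), reverse=True)
    for j in ranked[:top_k]:
        if try_step(i, j) > 0:   # objective decreased: accept this pair
            return j
    return None                  # caller falls back to another heuristic
```

In full SMO the fallback would scan the remaining non-bound points; here the sketch only shows the ranked top-k test that distinguishes Q-SMO from plain second-choice selection.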
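The MEB core-set machinery behind CVM can be illustrated with a simplified $(1+\epsilon)$-approximation in input space, following the Badoiu-Clarkson style iteration that CVM builds on. This is a hedged sketch, not the QS-CVM algorithm: the center update is a crude averaging step rather than an exact re-solve of the MEB over the core set, and no kernel, Q-SMO inner solver, or probabilistic speedup is used.

```python
import numpy as np

def meb_core_set(X, eps=0.1, max_iter=100):
    """Approximate the Minimum Enclosing Ball of the rows of X with a
    core set: repeatedly add the point farthest from the current center
    and nudge the center toward it, stopping once every point lies
    within (1 + eps) times the radius spanned by the core set."""
    X = np.asarray(X, dtype=float)
    c = X[0].copy()                  # initial center: an arbitrary point
    core = {0}
    for t in range(1, max_iter + 1):
        d = np.linalg.norm(X - c, axis=1)
        far = int(np.argmax(d))      # farthest point from current center
        r = d[list(core)].max()      # radius over the current core set
        if d[far] <= (1 + eps) * max(r, 1e-12):
            break                    # all points inside the inflated ball
        core.add(far)
        c += (X[far] - c) / (t + 1)  # simplified averaging-style update
    radius = np.linalg.norm(X - c, axis=1).max()
    return c, radius, sorted(core)
```

The point of the construction is that the core set stays small (its size depends on eps, not on the number of points), which is what lets CVM-style methods scale to large data sets.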
Keywords/Search Tags:Support Vector Machine, Sequential Minimal Optimization, Minimum Enclosing Ball, Core Vector Machine