
Fast Classification Based on the Minimal Hyper-Sphere of Class Centers

Posted on: 2007-04-15
Degree: Master
Type: Thesis
Country: China
Candidate: Z Su
Full Text: PDF
GTID: 2120360242960877
Subject: Probability theory and mathematical statistics
Abstract/Summary:
As a theory of learning from limited samples, Statistical Learning Theory (SLT) offers clear advantages in pattern recognition, particularly for small-sample, nonlinear, and high-dimensional problems, and it strengthens the theoretical foundations of pattern recognition and machine learning. The Support Vector Machine (SVM) is a learning method developed from SLT. In small-sample learning, it yields the optimal solution available from the limited information and addresses problems such as model selection, overfitting, nonlinearity, and the curse of dimensionality in high-dimensional spaces.

In this dissertation we introduce several improved SVM algorithms. The main contents lie in three aspects. First, we discuss machine learning problems in the field of pattern recognition and systematically introduce the main theory of SVM. Second, we present two fast classification algorithms based on the geometric character of the training points; in this way we avoid solving the traditional quadratic program. Compared with traditional methods, these learning algorithms not only increase training speed but also reduce memory consumption. We then analyze the properties of the two fast algorithms and compare them with linear and nearly linear SVMs. Finally, we generalize the two methods by introducing kernel functions to solve nonlinear classification problems.
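The abstract does not spell out the two fast algorithms, but the "halving the nearest points" idea can be sketched in a simplified form: place the separating hyperplane through the midpoint of the segment joining the two classes, perpendicular to that segment. The sketch below is a hypothetical illustration, using class centroids as a cheap stand-in for the nearest points of the two convex hulls that the actual method would use; the function names are ours, not the thesis's.

```python
import numpy as np

def centroid_halving_classifier(X_pos, X_neg):
    """Fit a linear separator by halving the segment between class centers.

    Simplified illustration only: the thesis's method bisects the segment
    joining the *nearest points* of the two classes; here class centroids
    stand in for those points. No quadratic program is solved.
    """
    c_pos = X_pos.mean(axis=0)        # center of the positive class
    c_neg = X_neg.mean(axis=0)        # center of the negative class
    w = c_pos - c_neg                 # hyperplane normal along the segment
    b = -w @ (c_pos + c_neg) / 2.0    # hyperplane passes through the midpoint
    return w, b

def predict(w, b, X):
    return np.sign(X @ w + b)

# Two well-separated point clouds
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+2.0, scale=0.5, size=(50, 2))
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(50, 2))
w, b = centroid_halving_classifier(X_pos, X_neg)
acc = (np.concatenate([predict(w, b, X_pos), -predict(w, b, X_neg)]) == 1).mean()
```

Because only class means and a dot product are involved, training is linear in the number of points and needs no stored kernel matrix, which matches the abstract's claims about speed and memory.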
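The kernel generalization mentioned at the end can be illustrated by a nearest-class-center rule evaluated entirely through a kernel function, in the spirit of the "minimal hyper-sphere of class center" in the title. This is our own simplified sketch, not the thesis's algorithm: the squared distance from a mapped point to a class center in feature space expands into kernel evaluations only.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def center_distance_sq(x, X_class, kernel):
    """Squared feature-space distance from phi(x) to the class center,
    via the kernel trick:
      ||phi(x) - c||^2 = k(x,x) - (2/n) sum_i k(x, x_i)
                          + (1/n^2) sum_{i,j} k(x_i, x_j)
    """
    x = x[None, :]
    return (kernel(x, x)[0, 0]
            - 2.0 * kernel(x, X_class).mean()
            + kernel(X_class, X_class).mean())

def predict_kernel(x, X_pos, X_neg, kernel=rbf_kernel):
    """Assign x to whichever class center is nearer in feature space."""
    d_pos = center_distance_sq(x, X_pos, kernel)
    d_neg = center_distance_sq(x, X_neg, kernel)
    return 1 if d_pos < d_neg else -1

# Example: a ring (negative class) around a blob (positive class),
# a pattern no linear separator in the input space can handle
rng = np.random.default_rng(1)
X_pos = rng.normal(0.0, 0.3, size=(40, 2))
theta = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
X_neg = 3.0 * np.column_stack([np.cos(theta), np.sin(theta)])

inner = predict_kernel(np.array([0.1, -0.1]), X_pos, X_neg)  # expect +1
outer = predict_kernel(np.array([0.0, 3.0]), X_pos, X_neg)   # expect -1
```

With a linear kernel this rule reduces to comparing distances to the ordinary class centroids, so the kernelized version genuinely generalizes the linear sketch above.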
Keywords/Search Tags:pattern recognition, separating hyper-plane, separating hyper-curve, kernel function, Support Vector Machine (SVM), halving the nearest points method, dividing the nearest points proportionally method, the minimal hyper-sphere of class center