
Research of Kernels in the Support Vector Classification Machine

Posted on: 2010-06-30 | Degree: Master | Type: Thesis
Country: China | Candidate: H Y Li | Full Text: PDF
GTID: 2178360275474574 | Subject: Probability theory and mathematical statistics
Abstract/Summary:
The Support Vector Machine (SVM) is a central result of Statistical Learning Theory as developed from the 1990s. It is a tool that solves machine learning problems by optimization, and it is characterized by the optimal separating hyperplane, the theory of Mercer kernel functions, convex optimization, and the sparseness of its solution. It therefore combines a well-posed optimization problem and a simple structure with strong performance, in particular good generalization ability, and it is widely applied to pattern classification, regression analysis and probability density estimation. The kernel function is the crucial ingredient of SVM: it sidesteps the explicit computation in the high-dimensional feature space, and different kernels yield different SVM algorithms. Among the kernel functions, many researchers pay particular attention to the Gaussian kernel because of its properties and broad applicability. Since the kernel function directly determines the performance of SVM, research on kernel functions has become a focus of attention and one of the core issues to be addressed in SVM.

This thesis concentrates on the selection of kernel parameters and the modification of kernel functions, with the following main contributions.

First, the kernel parameters of SVM classification models are analyzed, and several frequently used methods for optimizing them are discussed, including cross validation, grid search, the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO); their advantages and disadvantages are summarized. Building on the fast convergence of PSO and the ability of GA to avoid being trapped in local optima, a new kernel parameter optimization method is proposed. The two algorithms are run in turn under an iteration controller, and in every generation a number of the best individuals are exchanged between them. This hybrid overcomes the low accuracy or premature convergence that PSO suffers when it falls into a local optimum, and it improves classification accuracy.

Second, to adapt better to practical problems, the accuracy of SVM classification is improved on the basis of Riemannian geometry and experimental data. Because earlier kernel modifications are affected by the number and distribution of the support vectors, a new modification of the kernel function is proposed, based on Burges's Riemannian information-geometric method and Amari's geometric modification method. The idea is to modify the RBF kernel by a conformal transformation designed to enlarge the volume element of the Riemannian metric induced by the Gaussian kernel near the decision boundary, thereby improving the effectiveness of SVM classification. The new modification replaces the distance from a sample point to the support vectors with the distance from the sample point to the classification boundary; it thus overcomes the dependence of previous modified kernels on the number of support vectors and, to a large extent, improves the generalization ability of SVM.

Finally, these methods are applied to the UCI Mushroom classification database. The experiments show that the GA-PSO parameter selection method achieves good accuracy and good generalization ability. With the modified kernel described above, the parameters of the Gaussian kernel are optimized and the SVM is retrained; the experimental data show better classification performance than with the original kernel.
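To make the alternating GA/PSO parameter search concrete, the sketch below shows one way the scheme described above could be organized. The abstract does not give the thesis's implementation details, so the dataset (scikit-learn's breast-cancer data as a stand-in for the Mushroom data), the population sizes, the inertia and learning coefficients, the mutation rate, and the rule for exchanging the best individuals between the two populations are all illustrative assumptions.

# Minimal sketch (not the thesis code): alternate PSO and GA generations to tune
# the RBF-SVM parameters (log10 C, log10 gamma), exchanging best individuals.
import numpy as np
from sklearn.datasets import load_breast_cancer   # stand-in for the Mushroom data
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)

def fitness(ind):
    """Cross-validated accuracy of an RBF-SVM; ind = (log10 C, log10 gamma)."""
    clf = SVC(C=10.0 ** ind[0], gamma=10.0 ** ind[1])
    return cross_val_score(clf, X, y, cv=3).mean()

POP, DIM, LO, HI = 10, 2, -3.0, 3.0
swarm = rng.uniform(LO, HI, (POP, DIM))           # PSO population
vel = np.zeros((POP, DIM))
pbest = swarm.copy()
pbest_f = np.array([fitness(p) for p in pbest])
genpop = rng.uniform(LO, HI, (POP, DIM))          # GA population

def pso_step(swarm, vel, pbest, pbest_f):
    # standard PSO update: inertia + cognitive + social terms
    gbest = pbest[pbest_f.argmax()]
    vel[:] = (0.7 * vel
              + 1.5 * rng.random((POP, DIM)) * (pbest - swarm)
              + 1.5 * rng.random((POP, DIM)) * (gbest - swarm))
    swarm[:] = np.clip(swarm + vel, LO, HI)
    f = np.array([fitness(p) for p in swarm])
    better = f > pbest_f
    pbest[better], pbest_f[better] = swarm[better], f[better]

def ga_step(pop):
    # truncation selection, arithmetic crossover, Gaussian mutation
    f = np.array([fitness(p) for p in pop])
    parents = pop[f.argsort()[::-1][:POP // 2]]
    children = []
    while len(children) < POP:
        a, b = parents[rng.integers(len(parents), size=2)]
        w = rng.random(DIM)
        child = w * a + (1 - w) * b
        child += rng.normal(0, 0.3, DIM) * (rng.random(DIM) < 0.2)
        children.append(np.clip(child, LO, HI))
    return np.array(children)

for it in range(5):                               # iteration controller: run the two in turn
    pso_step(swarm, vel, pbest, pbest_f)
    genpop = ga_step(genpop)
    # exchange the current best individuals between the two populations
    genpop[0] = pbest[pbest_f.argmax()]
    ga_best = genpop[np.argmax([fitness(p) for p in genpop])]
    swarm[pbest_f.argmin()] = ga_best

best = pbest[pbest_f.argmax()]
print("best log10(C), log10(gamma):", best, "CV accuracy:", pbest_f.max())

The particular exchange rule (best PSO particle into the GA population, best GA chromosome replacing the weakest particle) is only one plausible reading of "interactions of the optimal particles in every generation".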
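The kernel modification follows the conformal-transformation idea of Amari and Wu: the Gaussian kernel K is rescaled by a positive data-dependent factor c(x). The abstract states that the thesis replaces the support-vector-centered factor with one driven by the distance of a sample from the decision boundary, but it does not give the exact formula; the boundary-distance factor written below is therefore a common choice stated as an assumption, not a formula quoted from the thesis.

\[
\tilde K(x,x') = c(x)\,c(x')\,K(x,x'), \qquad
K(x,x') = \exp\!\Big(-\frac{\lVert x-x'\rVert^2}{2\sigma^2}\Big),
\]
where the earlier, support-vector-based modification builds \(c(x)\) from Gaussian bumps centred on the support vectors \(x_i\), for example
\[
c(x) = \sum_{i\in \mathrm{SV}} e^{-\lVert x-x_i\rVert^2/(2\tau^2)},
\]
while a boundary-distance factor consistent with the abstract is
\[
c(x) = e^{-\kappa f(x)^2}, \qquad f(x) = \sum_i \alpha_i y_i K(x_i,x) + b .
\]
With this choice \(c(x)\) is largest on the separating surface \(f(x)=0\), so the volume element of the induced Riemannian metric is enlarged exactly where separation is hardest, independently of how many support vectors there are.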
Keywords/Search Tags: Support Vector Machine, Gaussian Kernel Function, Genetic Algorithm, Particle Swarm Optimization (PSO), Riemannian Information Geometry