
The Least Squares Support Vector Machine Improved Sparseness Algorithm Research

Posted on: 2012-09-11
Degree: Master
Type: Thesis
Country: China
Candidate: Y Li
Full Text: PDF
GTID: 2178330335451872
Subject: Computer software and theory

Abstract/Summary:
The Support Vector Machine (SVM), a machine learning algorithm developed in the 1990s on the basis of Statistical Learning Theory, is one of the hot topics in machine learning research and is well suited to small-sample learning problems. Compared with traditional artificial neural networks, SVM training ultimately reduces to a quadratic optimization problem, which largely avoids the underfitting, overfitting, curse-of-dimensionality, and local-minima problems encountered in neural networks. Because the SVM is built on the structural risk minimization principle, its solution generalizes better than other nonlinear function methods, and it has been widely applied in fields such as pattern recognition and nonlinear function approximation. However, training an SVM requires solving a convex quadratic programming problem, which demands a large amount of computation. To reduce this computational complexity, Suykens et al. proposed the Least Squares Support Vector Machine (LS-SVM), which uses a squared error term in the optimization objective and keeps only equality constraints, so that the quadratic programming problem becomes a set of linear equations for the optimal classification surface. This simplifies the computation but sacrifices the sparseness that is an advantage of the traditional SVM. To improve LS-SVM, this thesis studies the algorithm from the following aspects:

Firstly, the shortcomings of the traditional LS-SVM are analyzed, and a density index formula defined in the high-dimensional feature space is introduced to sparsify the LS-SVM. This differs from traditional sparse algorithms, which prune in the original space according to the absolute value of the support values.

Secondly, alongside the use of the density function in the LS-SVM model, the notion of "membership" is introduced and a new fuzzy membership function is constructed. Unlike most fuzzy memberships, which are defined in the original sample space, this thesis takes each sample point as the center of a sphere in the high-dimensional feature space to determine its fuzzy membership, and then performs training.

Thirdly, the algorithm is implemented and evaluated on the MATLAB platform; compared with the standard LS-SVM, it shows clear advantages.
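The following is a minimal sketch, in Python rather than the thesis's MATLAB code, of the two ideas summarized above: training an LS-SVM classifier by solving one linear system instead of a quadratic program, and scoring training samples with a density index computed in the kernel-induced feature space as a basis for pruning. The kernel choice, the parameters (gamma, sigma, radius), the exponential density formula, and the "keep the densest half" pruning rule are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np


def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X1 and the rows of X2."""
    sq = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-sq / (2 * sigma**2))


def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual as a single linear system (Suykens' formulation):
        [ 0    y^T            ] [ b     ]   [ 0 ]
        [ y    Omega + I/gamma] [ alpha ] = [ 1 ]
    where Omega_ij = y_i * y_j * K(x_i, x_j)."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b


def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    """Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b )."""
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)


def feature_space_density(X, sigma=1.0, radius=1.0):
    """Density index of each sample computed in the high-dimensional feature space.
    Squared distances ||phi(x_i) - phi(x_j)||^2 are obtained from kernel values only
    (K_ii + K_jj - 2 K_ij), so the mapping phi is never formed explicitly. The
    exponential weighting follows the usual subtractive-clustering style density
    measure; the thesis's exact formula may differ."""
    K = rbf_kernel(X, X, sigma)
    d2 = np.diag(K)[:, None] + np.diag(K)[None, :] - 2 * K
    return np.exp(-d2 / (radius / 2) ** 2).sum(axis=1)


if __name__ == "__main__":
    # Two Gaussian blobs as a toy binary classification problem.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 0.5, (40, 2)), rng.normal(1, 0.5, (40, 2))])
    y = np.concatenate([-np.ones(40), np.ones(40)])

    # Keep only the densest half of the training set (a crude stand-in for the
    # pruning step) and train the smaller, sparser LS-SVM on it.
    keep = np.argsort(feature_space_density(X))[-40:]
    alpha, b = lssvm_train(X[keep], y[keep])
    acc = np.mean(lssvm_predict(X[keep], y[keep], alpha, b, X) == y)
    print("sparse LS-SVM training accuracy:", acc)
```

The point of the sketch is that both the training step and the density index need only kernel evaluations, so sparsification can be carried out in the feature space without ever constructing the mapping explicitly.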
Keywords/Search Tags: statistical learning theory, support vector machine, least squares support vector machine, density index function, fuzzy membership, sparseness