
Fast Support Vector Machine Classification Algorithms with Additive Kernels

Posted on: 2018-04-21
Degree: Master
Type: Thesis
Country: China
Candidate: X F Wang
Full Text: PDF
GTID: 2348330542952392
Subject: Applied Mathematics
Abstract/Summary:
Support vector machine (SVM) is an effective and popular classification tool in machine learning, and it has been widely applied to a variety of problems thanks to its good generalization performance. However, as data sets keep growing, SVM faces a serious challenge on large-scale classification tasks with millions of training samples or feature dimensions. Linear SVM can process such large-scale problems efficiently, but its accuracy is inferior to that of kernel SVM; nonlinear classifiers can achieve relatively high accuracy, but their training time is long. Recent progress has enabled an additive-kernel version of SVM, which draws upon efficient linear solvers to quickly solve large-scale problems while achieving higher test accuracy. Stochastic gradient descent (SGD) is one of the efficient methods for solving large-scale problems. In this paper, we use SGD and several improved SGD methods, including ASGD, SVRG, and the Katyusha algorithm, to solve the SVM classification problem with an additive kernel. The main work of this paper consists of the following two parts.

On the one hand, using the properties of the additive kernel and Nesterov's acceleration strategy, we propose a new algorithm called accelerated mini-batch stochastic gradient descent for SVM classification with additive kernel (ASGD-AKSVM). In the gradient update, the algorithm does not need to compute the inner product between the weight vector and the sample mapped into feature space. Instead, by the representer theorem and the additivity of the kernel, the decision value can be written as a sum of scalar functions, one for each feature dimension; by the Weierstrass approximation theorem, each scalar function can be approximated on a closed interval by a polynomial. This greatly reduces the computational complexity and shortens the training time. The algorithm also uses Nesterov's acceleration strategy to improve the convergence rate, and a look-up table to reduce training and test time and save memory. Experiments show that ASGD-AKSVM handles large-scale problems effectively, achieves higher classification accuracy, and converges faster.

On the other hand, combining the SVRG algorithm, a variance-reduced SGD method, with the Katyusha algorithm, an improved gradient-correction method, we propose the K-AKSVM algorithm. The algorithm approximates the gradient by a polynomial function according to the additive-kernel property, which reduces the cost of gradient computation; the variance-reduction strategy of SVRG and the correction strategy of Katyusha are then used to improve the convergence rate of SGD. Experimental results verify the effectiveness of the algorithm.
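To make the decomposition behind ASGD-AKSVM concrete, here is a minimal sketch of the identity the abstract relies on; the notation (D feature dimensions, dual coefficients \alpha_i, polynomial degree m) is assumed for illustration rather than taken from the thesis.

    K(x, z) = \sum_{d=1}^{D} k(x_d, z_d),
    f(x) = \sum_{i} \alpha_i y_i K(x_i, x) + b
         = \sum_{d=1}^{D} \Big[ \sum_{i} \alpha_i y_i k(x_{i,d}, x_d) \Big] + b
         = \sum_{d=1}^{D} g_d(x_d) + b.

Each scalar function g_d is then approximated on a closed interval by a polynomial, g_d(t) \approx \sum_{j=0}^{m} c_{d,j} t^j, so evaluating f costs O(Dm) per sample instead of O(nD) kernel evaluations.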
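A minimal runnable Python sketch of the corresponding mini-batch SGD update with Nesterov extrapolation, assuming the per-dimension functions are stored directly as polynomial coefficients; the function name, hyperparameter values, and monomial basis are illustrative choices, not the thesis implementation.

    import numpy as np

    def asgd_aksvm_sketch(X, y, degree=3, epochs=5, batch=64,
                          lam=1e-4, lr=0.1, mu=0.9):
        """Sketch of accelerated mini-batch SGD for an additive-kernel SVM.

        Each per-dimension scalar function g_d is represented by polynomial
        coefficients C[d, :], so f(x) = sum_d sum_j C[d, j] * x_d**j.
        All names here are hypothetical, not the thesis code.
        """
        n, D = X.shape
        C = np.zeros((D, degree + 1))  # polynomial coefficients per feature dim
        V = C.copy()                   # Nesterov lookahead point
        # Precompute the monomial basis: P[i, d, j] = X[i, d] ** j.
        P = np.stack([X ** j for j in range(degree + 1)], axis=2)

        for _ in range(epochs):
            for idx in np.array_split(np.random.permutation(n),
                                      max(1, n // batch)):
                Pb, yb = P[idx], y[idx]
                scores = np.einsum('idj,dj->i', Pb, V)    # f(x), additive form
                viol = (yb * scores < 1.0).astype(float)  # hinge-loss violators
                grad = lam * V - np.einsum('i,idj->dj', viol * yb, Pb) / len(idx)
                C_new = V - lr * grad                     # step at lookahead point
                V = C_new + mu * (C_new - C)              # Nesterov extrapolation
                C = C_new
        return C

At test time, the look-up table mentioned in the abstract would cache the values of each g_d on a grid of feature values, so evaluation reduces to D table reads per sample.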
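For the second contribution, the standard forms of the updates being combined may help; these are the textbook SVRG and Katyusha rules, not the K-AKSVM-specific versions.

    w_{t+1} = w_t - \eta ( \nabla f_{i_t}(w_t) - \nabla f_{i_t}(\tilde{w}) + \nabla F(\tilde{w}) )

Here \tilde{w} is a snapshot refreshed every epoch and \nabla F(\tilde{w}) is the full gradient at the snapshot, so the stochastic gradient's variance shrinks as w_t approaches \tilde{w}. Katyusha additionally couples the iterate with the snapshot through a negative-momentum combination, x_{t+1} = \tau_1 z_t + \tau_2 \tilde{w} + (1 - \tau_1 - \tau_2) y_t, which is the "correction strategy" the abstract refers to.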
Keywords/Search Tags:SVM, additive kernel, gradient approximation, Nesterov's acceleration strategy, SGD algorithm, improved SGD algorithm