
Sparse LSSVM Algorithms Based on QR Factorization

Posted on: 2018-09-14
Degree: Master
Type: Thesis
Country: China
Candidate: Y L Zhou
Full Text: PDF
GTID: 2348330542452393
Subject: Applied Mathematics
Abstract/Summary:
As is well known, the support vector machine (SVM) is one of the most effective data mining techniques and is widely used for classification and prediction problems. However, as the training set grows, the convex quadratic programming problem at the heart of SVM becomes computationally expensive to solve, and sometimes intractable. The least squares support vector machine (LSSVM) replaces this quadratic program with a system of linear equations, which greatly improves computational efficiency. The LSSVM solution, however, lacks sparseness, which makes it difficult to handle large-scale training problems effectively: when the training set is large, the kernel matrix formed from the training data is hard to store and may even be impossible to use in computation. Improving the sparsity of LSSVM is therefore an urgent requirement, and sparse least squares support vector machines have emerged to meet it. The key to a sparse LSSVM algorithm is to find a small-scale, low-rank approximation of the kernel matrix and compute with it in place of the full kernel matrix, so that increased solution sparsity and reduced computational complexity are achieved at the same time. This thesis studies sparse algorithms from the two aspects below.

On the one hand, traditional sparse LSSVM algorithms obtain a low-rank approximate kernel matrix by choosing a subset of samples at random. To make the small-scale low-rank approximation as close to the original kernel matrix as possible, we exploit the numerical stability of QR factorization to keep the selected samples strongly dissimilar from one another. Using QR factorization, we iteratively select columns of the kernel matrix, obtain a Nyström approximation of the kernel matrix, and find a sparse approximate solution of LSSVM; on this basis we propose the QRP-LSSVM algorithm. Theoretical analysis and experimental results show that QRP-LSSVM yields very sparse solutions and retains high accuracy even when the sparsity level is below 0.05%. The algorithm is stable and solves large-scale training problems effectively.

On the other hand, we apply the same QR-based sample selection method, which maintains the variability among the selected samples, to the reduced support vector machine (RSVM). Selecting a subset of samples via QR factorization filters out representative samples, giving a reduced support vector set that remedies the deficiencies caused by RSVM's random selection of the reduced set. We study RSVM algorithms whose reduced set is selected by QR factorization under different loss functions, obtaining three algorithms, QRP+SHRSVM, QRP+LSRSVM, and QRP+HRSVM, based on the squared hinge loss, the least squares loss, and the Huber loss respectively. Theoretical analysis and experimental results show that the Newton iteration inside the QRP+SHRSVM algorithm is simple and attains high test accuracy. Moreover, compared with the RR+SHRSVM algorithm, which selects the reduced set at random, our algorithm improves both the sparsity level and the stability of the results, highlighting the advantage of filtering representative samples as the reduced support vector set in QRP+SHRSVM.
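The LSSVM reformulation mentioned above can be made concrete with a minimal sketch. This is not the thesis's implementation: the RBF kernel, the regularization constant C, and all function names are illustrative assumptions. The point is that training reduces to solving one (n+1)-by-(n+1) linear system instead of a quadratic program.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian kernel matrix: K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def lssvm_train(X, y, C=10.0, gamma=1.0):
    """Solve the LSSVM dual: a single linear system
       [0   1^T      ] [b    ]   [0]
       [1   K + I/C  ] [alpha] = [y]
    instead of a convex quadratic program."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                   # top row:  [0, 1^T]
    A[1:, 0] = 1.0                   # left col: [1, ...]
    A[1:, 1:] = K + np.eye(n) / C    # K + I/C from the regularization term
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]           # alpha, b

def lssvm_predict(X_train, alpha, b, X_test, gamma=1.0):
    # Decision function: sign(sum_j alpha_j k(x, x_j) + b)
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha + b)
```

Note that every training point receives a nonzero coefficient alpha_j in this formulation, which is exactly the lack of sparseness discussed above.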
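The column-selection idea behind QRP-LSSVM can be sketched as a greedy pivoted Gram-Schmidt pass over the kernel matrix: at each step, pick the column with the largest residual norm (the column least explained by those already chosen), then deflate. This is an illustrative stand-in for column-pivoted QR, not the thesis's exact procedure, and the function names are assumptions.

```python
import numpy as np

def pivoted_column_select(K, m):
    """Greedily pick m columns of K that stay maximally
    linearly independent of the columns already chosen
    (the pivoting rule of column-pivoted QR)."""
    R = K.astype(float).copy()
    idx = []
    for _ in range(m):
        norms = np.linalg.norm(R, axis=0)
        j = int(np.argmax(norms))        # pivot: largest residual column
        idx.append(j)
        q = R[:, j] / norms[j]
        R -= np.outer(q, q @ R)          # deflate along the chosen direction
    return idx

def nystrom_approx(K, idx):
    """Nystrom approximation K ~ C W^+ C^T built from the
    selected columns C = K[:, idx] and W = K[idx][:, idx]."""
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    return C @ np.linalg.pinv(W) @ C.T
```

When the kernel matrix is (numerically) low rank, selecting as many well-separated columns as the rank makes the Nyström approximation essentially exact, which is why pivoted selection can beat random selection at the same sparsity level.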
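The Newton iteration for the squared hinge loss referred to above can be illustrated on the linear primal problem min_w 0.5||w||^2 + C * sum_i max(0, 1 - y_i <w, x_i>)^2, whose objective is smooth and piecewise quadratic, so each Newton step solves a small linear system over the currently active (margin-violating) examples. This simplified sketch omits the reduced-set kernel expansion of QRP+SHRSVM, and all names and parameters are assumptions.

```python
import numpy as np

def sq_hinge_newton(X, y, C=1.0, iters=20):
    """Newton's method on the squared hinge objective
    0.5*||w||^2 + C * sum_i max(0, 1 - y_i <w, x_i>)^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margin = 1.0 - y * (X @ w)
        active = margin > 0                     # margin-violating examples
        Xa, ya, ma = X[active], y[active], margin[active]
        grad = w - 2.0 * C * (Xa.T @ (ya * ma))
        H = np.eye(d) + 2.0 * C * (Xa.T @ Xa)   # Hessian on the active set
        step = np.linalg.solve(H, grad)
        if np.linalg.norm(step) < 1e-10:        # stationary: done
            break
        w -= step
    return w
```

Because the objective is piecewise quadratic, the iteration typically terminates in a handful of steps once the active set stabilizes, which is the simplicity claimed for the squared-hinge variant.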
Keywords/Search Tags: Sparse LSSVM, QRP-LSSVM, QR factorization, Sparse solution, Reduced set, QRP+SHRSVM