
Research On Large Scale Sparse Support Vector Machines

Posted on: 2018-12-21
Degree: Doctor
Type: Dissertation
Country: China
Candidate: D L Liu
Full Text: PDF
GTID: 1318330512475548
Subject: Computer Science and Technology
Abstract/Summary:
Sparse learning is an effective class of methods for dealing with redundancy in data. At present, sparse optimization methods are widely used in compressed sensing, image processing, and other practical problems, and their theory and algorithms are developing rapidly. Because large-scale data mining problems are typically redundant and sparse, sparse optimization is a natural choice for them. The support vector machine, as a general machine learning method, has a solid foundation in statistical learning theory, performs well in practice, is easy to use, and has few model parameters; it is widely applied to images, video, audio, text, and many other fields. However, the theory and methods of large-scale sparse support vector machines are not yet mature: theoretical foundations, models, and algorithms are all lacking, and the area is still at an initial stage. Open issues include: 1) validity indices for sparse models, i.e., how to measure the degree of model sparsity; 2) the lack of a theoretical basis; 3) the solution of large-scale problems; 4) extensions of the basic models. There is therefore considerable room for development, and we study the theory and algorithms of this area from the viewpoint of optimization.

This thesis consists of 7 chapters, organized as follows.

In Chapter 1, we briefly introduce the research background, the significance, the objects, and the main work of this thesis.

In Chapter 2, we introduce the relevant algorithms, including the standard support vector machine (SVM), the least squares support vector machine (LSSVM), the support vector machine with ramp loss (RSVM), the twin support vector machine (TWSVM), and the nonparallel hyperplanes support vector machine (NPSVM), and compare their advantages and shortcomings. Since the NPSVM has the better generalization ability, the following chapters focus on the NPSVM: on one hand exploring its statistical learning theory foundation, and on the other hand constructing NPSVM variants with more sparsity and large-scale capability.

In Chapter 3, for the binary classification problem, we propose a robust and sparse NPSVM (RNPSVM) based on the ramp loss function. We construct a new ramp loss (the ε-insensitive ramp loss) and introduce it into the NPSVM together with the standard ramp loss. The resulting RNPSVM can deal with noise and outliers in the training data and has fewer support vectors, so the sparsity of the model is improved and the generalization ability is better. The non-convexity of RNPSVM is handled by the CCCP (concave-convex procedure) strategy. Numerical experiments demonstrate the efficiency of RNPSVM.

Chapter 4 constructs the structural risk minimization principle from the angle of the U-SVM, thereby giving a statistical learning theory explanation of the NPSVM. Then, linear programming models of the NPSVM and RNPSVM are constructed by replacing the convex quadratic programming problems in the NPSVM and RNPSVM. Since linear programs can be solved efficiently, the corresponding algorithms can handle large-scale cases.

In Chapter 5, we first discuss the relationship between LSTWSVM and LSSVM, and prove that LSSVM is a degenerate case of LSTWSVM. Furthermore, we propose a sparser LSSVM (RLSSVM), which combines the ε-LSSVM with a newly constructed ramp loss function. Compared with the primal LSSVM, RLSSVM is sparser and more robust, and is more suitable for large-scale problems. It can also be solved by CCCP.

Chapter 6 proposes ADMM (alternating direction method of multipliers) algorithms for the NPSVM and RNPSVM. ADMM is an efficient method for solving large-scale problems. After constructing optimization problems in a form suitable for ADMM, we apply the method, and preliminary experiments demonstrate its efficiency.

In Chapter 7, we summarize this thesis and outline plans for future work.
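To make the loss function of Chapter 3 concrete: the standard ramp loss can be written as the difference of two hinge losses, R_s(z) = H_1(z) − H_s(z) with H_t(z) = max(0, t − z) and s < 1, which clips the hinge loss at 1 − s so that gross outliers contribute only a bounded penalty (this is what removes their support vectors). A minimal sketch; the parameter value s = −1 and the sample margins are illustrative, not taken from the thesis:

```python
def hinge(z, t=1.0):
    """Shifted hinge loss H_t(z) = max(0, t - z)."""
    return max(0.0, t - z)

def ramp(z, s=-1.0):
    """Ramp loss R_s(z) = H_1(z) - H_s(z), with s < 1.
    Agrees with the hinge loss for margins z >= s, but is capped
    at 1 - s, so outliers with very negative margins stop adding penalty."""
    return hinge(z, 1.0) - hinge(z, s)

print(ramp(2.0))    # 0.0  : correctly classified with margin, no loss
print(ramp(0.5))    # 0.5  : inside the margin, same as the hinge loss
print(ramp(-10.0))  # 2.0  : gross outlier, penalty capped at 1 - s
```

The cap is exactly what the unbounded hinge loss lacks: under the hinge loss, `ramp(-10.0)` would cost 11, forcing the model to bend toward the outlier.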
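The CCCP strategy used in Chapters 3 and 5 handles such a non-convex objective by splitting it into a convex part plus a concave part, then repeatedly minimizing the convex part with the concave part replaced by its linearization at the current iterate. A toy illustration on f(x) = x⁴ − x², not a model from the thesis (x⁴ is the convex part, −x² the concave part):

```python
def cccp_minimize(x0, iters=50):
    """CCCP on f(x) = x**4 - x**2, started from a positive x0.
    Split: convex g(x) = x**4, concave h(x) = -x**2 with h'(x) = -2x.
    Each step minimizes g(x) + h'(x_k) * x, i.e. solves 4x**3 = 2*x_k,
    which has the closed-form update x_{k+1} = (x_k / 2) ** (1/3)."""
    x = x0
    for _ in range(iters):
        x = (x / 2.0) ** (1.0 / 3.0)
    return x

# The iterates converge to 1/sqrt(2) ~ 0.7071, the true minimizer of f.
print(round(cccp_minimize(1.0), 4))  # 0.7071
```

Each subproblem is convex (here it is even solvable in closed form), and the objective value decreases monotonically, which is why CCCP is a natural fit for the ramp-loss models: their concave part is piecewise linear, so every subproblem is again a convex program of the original type.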
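The ADMM of Chapter 6 alternates between two easy subproblems and a dual update. Its structure can be shown on the smallest non-trivial example, the one-variable lasso-type problem min_x ½(ax − b)² + λ|x|, split as f(x) + g(z) subject to x = z; the numbers a = 2, b = 3, λ = 1 below are illustrative, not from the thesis (which applies ADMM to the NPSVM and RNPSVM problems):

```python
def soft_threshold(v, t):
    """Proximal operator of t*|.|: shrink v toward 0 by t."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def admm_lasso_1d(a, b, lam, rho=1.0, iters=100):
    """ADMM for min_x 0.5*(a*x - b)**2 + lam*|x|, with split x = z."""
    x = z = u = 0.0
    for _ in range(iters):
        # x-update: smooth term + (rho/2)(x - z + u)^2, closed form.
        x = (a * b + rho * (z - u)) / (a * a + rho)
        # z-update: proximal step on the l1 term.
        z = soft_threshold(x + u, lam / rho)
        # Dual update on the scaled multiplier.
        u += x - z
    return z

# Optimality condition a*(a*x - b) + lam*sign(x) = 0 gives x* = 1.25.
print(round(admm_lasso_1d(2.0, 3.0, 1.0), 6))  # 1.25
```

The same three-step pattern scales to the large problems of Chapter 6: each subproblem only touches one block of variables, so the expensive coupled quadratic program is never solved directly.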
Keywords/Search Tags: Large scale, sparse learning, support vector machine, optimization, loss function, data mining, classification problem