
Research And Application Of Nonparallel Support Vector Machine

Posted on: 2023-04-06    Degree: Master    Type: Thesis
Country: China    Candidate: W M Zhang    Full Text: PDF
GTID: 2558306911484704    Subject: Applied Mathematics
Abstract/Summary:
As an important algorithm in machine learning, the support vector machine follows both the empirical risk minimization criterion and the structural risk minimization criterion, and it has clear theoretical and geometric interpretations. With the development of support vector machines, the nonparallel support vector machine, which improves computational efficiency, has gradually attracted attention and has become an important extension of the support vector machine. The nonparallel support vector machine makes full use of the sample characteristics of each class to construct two optimization problems, finds an approximating hyperplane for each class, and considers only one class of samples in each optimization problem, which improves computational efficiency. In this thesis, two classical nonparallel support vector machine models are studied. Exploiting the ease of optimization of the least squares loss function and the robustness of the truncated least squares loss function, a robust least squares projection twin support vector machine and a nonparallel least squares support vector ordinal regression machine are constructed, respectively. According to the characteristics of each new model, DC (difference of convex functions) programming and the alternating direction method of multipliers (ADMM) are used to solve them, respectively.

First, to address sensitivity to outliers, a robust least squares projection twin support vector machine is proposed by applying the truncated least squares loss function, and its robustness is established from a weighting perspective. Because the objective function of the new model can be expressed as the difference of two convex functions, the model is transformed into a DC program and solved accordingly. Furthermore, to address the lack of sparsity in the solution of the new model, a pivoted incomplete Cholesky decomposition is used to approximate the kernel matrix, and a sparse solution is then obtained iteratively; this yields the sparse robust least squares projection twin support vector machine algorithm. Experimental results show that the proposed sparse algorithm is insensitive to outliers and can quickly process large-scale datasets, so it is both robust and sparse.

Second, to reduce the large time cost of hyperparameter optimization, a nonparallel least squares support vector ordinal regression model is proposed by introducing the least squares loss function. Because the new model can be rewritten in the standard form required by ADMM, ADMM is used to solve it, yielding a more efficient nonparallel least squares support vector ordinal regression algorithm. Numerical experiments show that the proposed algorithm achieves mean zero-one error and mean absolute error comparable to existing methods on both ordinal regression datasets and real regression datasets, with fast computation.
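To make the DC structure concrete, the sketch below illustrates the general idea on a toy one-dimensional robust least squares fit rather than on the thesis's projection twin model: the truncated least squares loss min(t^2, s) is written as the difference of two convex functions, t^2 - max(t^2 - s, 0), and each DC (concave-convex procedure) iteration linearizes the concave part and solves the resulting convex subproblem in closed form. The variable names, the toy problem, and the truncation parameter s are illustrative assumptions, not the notation or model of the thesis.

import numpy as np

def truncated_ls_loss(t, s):
    # Truncated least squares loss: min(t**2, s).
    # DC decomposition: min(t**2, s) = t**2 - max(t**2 - s, 0),
    # i.e. the difference of the convex parts g(t) = t**2 and h(t) = max(t**2 - s, 0).
    return np.minimum(t ** 2, s)

def h_grad(t, s):
    # A subgradient of the convex part h(t) = max(t**2 - s, 0) that is subtracted.
    return np.where(t ** 2 > s, 2.0 * t, 0.0)

def dc_fit(x, y, s, w0=0.0, n_iter=50):
    # DC / concave-convex iterations for min_w sum_i min((y_i - w*x_i)**2, s):
    # linearize h at the current w and minimize the convex surrogate in closed form.
    w = w0
    for _ in range(n_iter):
        r = y - w * x                        # residuals at the current iterate
        c = np.sum(h_grad(r, s) * (-x))      # gradient of sum_i h_i(w) at the current w
        # new w solves d/dw [ sum_i (y_i - w*x_i)**2 - c*w ] = 0
        w = (2.0 * np.sum(x * y) + c) / (2.0 * np.sum(x * x))
    return w

A small usage check under the same assumptions: a few gross outliers barely move the fitted slope, because points whose squared residual exceeds s stop contributing to the active least squares fit at the fixed point.

rng = np.random.RandomState(0)
x = rng.randn(100)
y = 2.0 * x + 0.1 * rng.randn(100)
y[:5] += 20.0                        # inject a few outliers
print(dc_fit(x, y, s=1.0))           # should recover a slope near 2.0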
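The sparsification step relies on a low-rank approximation of the kernel matrix. Below is a minimal sketch of a pivoted incomplete Cholesky factorization, K ≈ G Gᵀ, which evaluates only as many kernel columns as the chosen rank; the RBF kernel, the function names, and the stopping tolerance are illustrative assumptions rather than the thesis's exact procedure.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # RBF kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def pivoted_incomplete_cholesky(X, kernel, rank, tol=1e-6):
    # Low-rank factor G with K ~= G @ G.T, evaluating at most `rank` kernel columns.
    n = X.shape[0]
    G = np.zeros((n, rank))
    # residual diagonal of the kernel matrix (all ones for an RBF kernel)
    d = np.array([kernel(X[i:i + 1], X[i:i + 1])[0, 0] for i in range(n)])
    pivots = []
    for m in range(rank):
        j = int(np.argmax(d))                 # greedy pivot: largest residual diagonal
        if d[j] <= tol:                       # residual trace is already small enough
            G = G[:, :m]
            break
        pivots.append(j)
        col = kernel(X, X[j:j + 1]).ravel()   # one kernel column K[:, j]
        G[:, m] = (col - G[:, :m] @ G[j, :m]) / np.sqrt(d[j])
        d -= G[:, m] ** 2                     # update the residual diagonal
    return G, pivots

Usage under the same assumptions; the dense kernel matrix is built here only to measure the approximation error, which shrinks as the rank grows:

rng = np.random.RandomState(0)
X = rng.randn(300, 4)
kern = lambda A, B: rbf_kernel(A, B, gamma=0.5)
G, piv = pivoted_incomplete_cholesky(X, kern, rank=60)
print(np.linalg.norm(kern(X, X) - G @ G.T))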
Keywords/Search Tags: Proximal Hyperplane, DC Programming, Cholesky Decomposition, Alternating Direction Method of Multipliers, Ordinal Regression