
Multi-classification Algorithm Based On Nonparallel Support Vector Machine

Posted on: 2020-08-21
Degree: Master
Type: Thesis
Country: China
Candidate: R J Liu
Full Text: PDF
GTID: 2428330572989719
Subject: Operational Research and Cybernetics

Abstract/Summary:
The support vector machine (SVM) is an important algorithm for solving machine learning problems. It integrates a number of techniques, has received wide attention since its introduction, and has been applied in many fields. SVM was originally designed for binary classification, whereas many problems encountered in practical applications are multi-class classification problems. How to extend binary classification algorithms to multi-class classification problems therefore has significant research value.

Nonparallel hyperplane support vector machines have advantages in handling datasets whose classes cross each other and in handling large-scale datasets. For multi-class classification problems, Chapter 3 starts from the nonparallel hyperplane support vector machine multi-class classification algorithm (NHCMC) and proposes the first algorithm of this thesis, the ε-nonparallel support vector machine multi-class classification algorithm, abbreviated INHCMC. Drawing on the idea of the nonparallel support vector machine (NPSVM), the quadratic loss function in NHCMC is replaced by the ε-insensitive loss function, which improves the semi-sparseness of the solution. Numerical experiments show that INHCMC is effective.

Because INHCMC uses the one-against-rest strategy for multi-class classification, the resulting binary subproblems are class-imbalanced, which makes it difficult to select an appropriate parameter ε. Chapter 4 therefore proposes the ν-ε-nonparallel support vector machine multi-class classification algorithm based on INHCMC, abbreviated ν-INHCMC. Combining the ideas of ν-SVC and ν-SVR, the parameter ε and the parameter C2 in the original INHCMC model are replaced by the parameter ν. This removes the difficulty of choosing ε in INHCMC; in addition, ν trades off the two objectives of the model (maximizing the margin and minimizing the error) and also controls the number of support vectors. Numerical experiments show that ν-INHCMC is effective.

The alternating direction method of multipliers (ADMM) is an effective method for separable convex programming problems, especially large-scale ones, and in recent years it has been applied to optimization problems arising in machine learning. In Chapter 5 we use the ADMM framework to solve the original (primal) problem of INHCMC. To cope with large-scale datasets, the conjugate gradient method is used to solve the linear system in the subproblem approximately, instead of computing the matrix inverse explicitly. Numerical experiments show that the resulting algorithm is effective.
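As an informal illustration of the one-against-rest decomposition and of the two parameterizations (ε and ν) discussed above, the Python sketch below trains one regression-type scorer per class on ±1 targets and predicts by the largest score. It uses scikit-learn's standard ε-SVR and ν-SVR only as stand-ins for the binary subproblems; it is not the INHCMC / ν-INHCMC model itself, and the dataset and parameter values are illustrative assumptions.

```python
# Illustrative one-against-rest wrapper: one scorer per class, predict by argmax.
# SVR (epsilon-insensitive) and NuSVR (nu-parameterized) are stand-ins for the
# binary subproblems; this is NOT the thesis's INHCMC / nu-INHCMC model.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVR, NuSVR

X, y = load_iris(return_X_y=True)
classes = np.unique(y)

def one_against_rest_fit(make_scorer):
    """Train one binary scorer per class on +1 / -1 targets."""
    scorers = []
    for c in classes:
        target = np.where(y == c, 1.0, -1.0)   # "rest" side is much larger -> imbalance
        scorers.append(make_scorer().fit(X, target))
    return scorers

def one_against_rest_predict(scorers, X_new):
    scores = np.column_stack([s.predict(X_new) for s in scorers])
    return classes[np.argmax(scores, axis=1)]  # class with the largest score wins

# epsilon-parameterized subproblems (epsilon must be tuned for each subproblem)
eps_scorers = one_against_rest_fit(lambda: SVR(kernel="rbf", C=1.0, epsilon=0.1))
# nu-parameterized subproblems (nu in (0, 1] bounds the fraction of support vectors)
nu_scorers = one_against_rest_fit(lambda: NuSVR(kernel="rbf", C=1.0, nu=0.5))

print((one_against_rest_predict(eps_scorers, X) == y).mean())
print((one_against_rest_predict(nu_scorers, X) == y).mean())
```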
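The ADMM-plus-conjugate-gradient idea of Chapter 5 can be sketched on a generic separable problem. The snippet below solves an ℓ1-regularized least-squares problem (not the INHCMC primal problem) and uses SciPy's conjugate gradient routine so that the x-update never forms a matrix inverse explicitly; the problem sizes and penalty values are assumptions chosen only for illustration.

```python
# Generic ADMM sketch for min_x 0.5*||A x - b||^2 + lam*||x||_1, where the
# x-subproblem (A^T A + rho*I) x = A^T b + rho*(z - u) is solved inexactly by
# conjugate gradient instead of an explicit matrix inverse.
# This is NOT the INHCMC primal problem; it only illustrates the solver pattern.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
m, n = 200, 50                                  # illustrative problem size
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam, rho = 0.1, 1.0

Atb = A.T @ b
matvec = lambda v: A.T @ (A @ v) + rho * v      # (A^T A + rho*I) v, matrix-free
K = LinearOperator((n, n), matvec=matvec)

x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
for _ in range(100):
    # x-update: CG solve of the linear system, warm-started at the previous x
    x, _ = cg(K, Atb + rho * (z - u), x0=x)
    # z-update: soft-thresholding (proximal operator of the l1 norm)
    v = x + u
    z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
    # dual update
    u = u + x - z

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```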
Keywords/Search Tags: Multi-class classification, Support vector machine, Nonparallel, Quadratic programming, Sparsity, Alternating direction method of multipliers