
The Study Of Algorithms For Non-Linear Optimization Problems

Posted on: 2016-05-28
Degree: Master
Type: Thesis
Country: China
Candidate: C R Li
Full Text: PDF
GTID: 2308330461974062
Subject: Computer application technology
Abstract/Summary:
In this paper, the author focuses on nonlinear optimization algorithms and investigates Support Vector Machines (SVMs) with respect to classification accuracy and training speed. The main research contributions fall into the following three parts.

Firstly, a new SVM classification model (DSVM) built from two cooperating sub-models is proposed. Given a binary classification problem, the original SVM separates the samples with a maximum-margin hyperplane. Experiments show that although the overall performance of SVM is generally good, it classifies samples near the decision boundary poorly. DSVM is proposed to overcome this problem: the first sub-model alone classifies the samples outside its channel, and cooperates with the second sub-model to improve classification accuracy inside the channel. The second sub-model is trained on the first sub-model's support vectors and misclassified samples, used respectively for training and testing under a cross-validation strategy.

Secondly, research findings on parallelizing sequential minimal optimization (SMO) are presented. Constructing an SVM model is equivalent to solving a convex quadratic programming (QP) problem, whose storage requirement grows with the size of the training set. SMO is an effective decomposition method for dealing with this storage limitation: it iteratively selects and updates one violating pair of multipliers until the objective function converges. A parallel version of SMO (PSMO) was previously proposed to decrease the number of iterations by selecting and updating several violating pairs simultaneously. However, PSMO cannot guarantee convergence, because the violation statuses of the multipliers influence one another. Two improved versions of PSMO (IPSMO and IIPSMO) are proposed in which this convergence issue is successfully resolved. IPSMO chooses between a single-pair and a multi-pair update mode, taking whichever yields the larger improvement of the objective function.
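As background for these parallel variants, the single-pair SMO baseline they extend can be sketched as follows. This is a minimal illustration only, with a linear kernel and a toy dataset; the function names and data are assumptions for exposition, not taken from the thesis:

```python
import numpy as np

def select_pair(y, a, G, C, eps=1e-3):
    """Maximal-violating-pair selection: return the pair (i, j) that most
    violates the KKT conditions of the dual QP, or None once converged."""
    up  = ((y == 1) & (a < C)) | ((y == -1) & (a > 0))
    low = ((y == 1) & (a > 0)) | ((y == -1) & (a < C))
    yG = -y * G
    i = np.flatnonzero(up)[np.argmax(yG[up])]
    j = np.flatnonzero(low)[np.argmin(yG[low])]
    if yG[i] - yG[j] < eps:        # stopping criterion: KKT nearly satisfied
        return None
    return i, j

def smo_train(X, y, C=1.0, eps=1e-3, max_iter=10000):
    """Single-pair SMO for the SVM dual QP with a linear kernel:
    min 0.5 a'Qa - e'a  s.t.  0 <= a_i <= C, y'a = 0, Q_ij = y_i y_j x_i.x_j
    """
    n = len(y)
    Q = np.outer(y, y) * (X @ X.T)     # labelled kernel matrix
    a = np.zeros(n)
    G = -np.ones(n)                    # gradient Qa - e at a = 0
    for _ in range(max_iter):
        pair = select_pair(y, a, G, C, eps)
        if pair is None:
            break
        i, j = pair
        yG = -y * G
        quad = max(Q[i, i] + Q[j, j] - 2 * y[i] * y[j] * Q[i, j], 1e-12)
        # unconstrained optimum of the two-variable subproblem, then
        # clipped so both multipliers stay inside the box [0, C]
        t = (yG[i] - yG[j]) / quad
        t = min(t,
                C - a[i] if y[i] == 1 else a[i],
                a[j] if y[j] == 1 else C - a[j])
        da_i, da_j = y[i] * t, -y[j] * t   # keeps y'a = 0
        a[i] += da_i
        a[j] += da_j
        G += Q[:, i] * da_i + Q[:, j] * da_j   # keep gradient up to date
    w = (a * y) @ X                    # primal weights (linear kernel only)
    free = (a > 1e-8) & (a < C - 1e-8)
    b = np.mean(y[free] - X[free] @ w) if free.any() else 0.0
    return a, w, b
```

Each iteration touches only two multipliers, which is what makes the decomposition cheap in storage; the parallel variants above instead update several such pairs per iteration.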
Convergence is guaranteed because IPSMO maintains a strict decrease of the objective function, which is bounded below by the optimal value. IIPSMO is proposed to overcome the time-consuming step in IPSMO: whenever previously chosen pairs are updated, it repeatedly re-examines the violation status of the remaining candidate pairs, so that every chosen pair is still a violating pair. The strategy is greedy, since the maximally violating pairs are selected from the candidate set at each step. Experiments show that IIPSMO is generally better than IPSMO in terms of both iteration count and training time.

Thirdly, an improved version of the LIBSVM working set selection algorithm (WSS 3-5) is proposed. There are two main working set selection algorithms: Maximal Violating Pair, which relies on the violation rules, and the LIBSVM selection, which relies on the improvement of the objective function. WSS 3-5 combines the virtues of both: it prefers violating multipliers that take large steps and promise relatively significant improvements of the objective function, while still satisfying the default violation conditions. WSS 3-5 is also combined with IPSMO and IIPSMO in the parallel setting, with satisfactory results.
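For reference, the standard second-order LIBSVM-style selection that combines violation checking with a predicted objective decrease, which WSS 3-5 builds on, can be sketched as follows. The precise WSS 3-5 rules are not spelled out in this abstract, so this sketch illustrates only the baseline rule; all names and the toy state in the usage below are assumptions:

```python
import numpy as np

def wss_second_order(y, a, G, Q, C, eps=1e-3):
    """LIBSVM-style working set selection: i is the maximal violator in
    I_up; j is chosen among the violating indices of I_low to maximise
    the predicted decrease of the objective for the pair (i, j).
    Returns None when the stopping criterion m(a) - M(a) < eps holds."""
    up  = ((y == 1) & (a < C)) | ((y == -1) & (a > 0))
    low = ((y == 1) & (a > 0)) | ((y == -1) & (a < C))
    yG = -y * G
    i = np.flatnonzero(up)[np.argmax(yG[up])]
    m = yG[i]
    if m - yG[low].min() < eps:          # no violating pair remains
        return None
    best_j, best_gain = -1, 0.0
    for t in np.flatnonzero(low):
        b_it = m - yG[t]                 # first-order violation of (i, t)
        if b_it <= 0:
            continue                     # (i, t) is not a violating pair
        quad = max(Q[i, i] + Q[t, t] - 2 * y[i] * y[t] * Q[i, t], 1e-12)
        gain = b_it * b_it / quad        # predicted objective decrease (x2)
        if gain > best_gain:
            best_gain, best_j = gain, t
    return i, best_j
```

The returned index j is thus both a violator with respect to i (first-order condition) and the candidate promising the largest second-order gain, which is the kind of combination the abstract attributes to WSS 3-5.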
Keywords/Search Tags: Support Vector Machine, SVM, convex quadratic programming, sequential minimal optimization, SMO, parallel algorithm, working set selection