
An MPEC Method For Optimal Parameter Selection In Support Vector Machines

Posted on: 2008-09-09    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y L Dong    Full Text: PDF
GTID: 1100360218955533    Subject: Operational Research and Cybernetics
Abstract/Summary:
The aim of this dissertation is to study the parameter selection problem in support vector machines. There are three reasons for choosing this subject. First, as an important machine learning method, the support vector machine has been applied in many fields and has achieved good results. Second, parameter selection in support vector machines is an important research topic, since different parameters lead to different generalization performance. Third, the traditional parameter selection methods, the tuning procedure and grid search, are time-consuming and cumbersome to operate. The dissertation focuses on parameter selection for the support vector machine with the L1 loss function; we discuss the models, present the algorithms, and report numerical experiments. The main results can be summarized as follows:

1. In Chapter 2, we first present the basic parameter selection models for support vector machines, including the primal model and the dual model. We then construct an MPEC model for selecting the parameter of the support vector machine with the L1 loss function; its objective function is integer-valued and lower semicontinuous. We discuss the model, present some of its properties, and prove an existence theorem for its solutions. Under the usual definition of a local minimizer, every feasible point is a local minimizer, so we introduce the notions of minimal cone and minimal block and establish necessary optimality conditions in terms of them.

2. In Chapter 3, we present a new model for computing the optimal value of the cost parameter C for problems with linearly non-separable data. The model is formulated as an MPEC with an integer-valued objective function. To overcome the nonsmoothness of the objective, we approximate it by a smooth concave function. Since the lower-level problem is a quadratic program, its solutions are equivalent to the solutions of its KKT system of equalities and inequalities, so we transform the problem into a nonlinear program with a smoothed objective function and complementarity constraints (a schematic form of this bilevel-to-MPEC reformulation is sketched after this list). We prove that a global solution of the smoothed problem is also a solution of the original problem. Because a problem with complementarity constraints fails the constraint qualifications assumed by nonlinear programming solvers, we apply an ε-slackness (relaxation) technique to handle this difficulty. Numerical experiments show that the model and algorithm are effective in choosing the cost parameter.

3. In Chapter 4, we combine a genetic method with a deterministic method to solve the MPEC problem. Once the parameter is assigned a value, the lower-level problem is a convex quadratic program, i.e., a standard support vector machine, so its solution can be obtained with standard SVM solvers (we use the SVMlight package); an illustrative sketch of this hybrid search also follows below. Numerical experiments show that this method is effective for selecting the parameters of the support vector machine.
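The following display is a minimal sketch, assuming the usual soft-margin (L1 loss) SVM and a validation-error objective, of the kind of bilevel model and MPEC reformulation described in Chapters 2 and 3. The symbols (training index set T, validation index set V, indicator objective) are illustrative choices rather than the dissertation's exact formulation, and the LaTeX assumes the amsmath package.

% Upper level: choose the cost parameter C by minimizing validation error;
% lower level: the standard L1-loss (soft-margin) SVM training problem.
\begin{align*}
\min_{C>0,\;w,b,\xi}\quad & \sum_{j\in\mathcal{V}} \mathbf{1}\!\left[\, y_j\bigl(w^{\top}x_j+b\bigr)\le 0 \,\right]
  && \text{(integer-valued, lower semicontinuous)}\\
\text{s.t.}\quad & (w,b,\xi)\in\operatorname*{arg\,min}_{w,b,\xi}\;
  \tfrac12\lVert w\rVert^{2}+C\sum_{i\in\mathcal{T}}\xi_i\\
& \qquad\text{s.t. } y_i\bigl(w^{\top}x_i+b\bigr)\ge 1-\xi_i,\ \ \xi_i\ge 0,\ \ i\in\mathcal{T}.
\end{align*}
% Because the lower level is a convex quadratic program, it can be replaced by its
% KKT system; the resulting complementarity conditions, e.g.
%   0 \le \alpha_i \perp y_i(w^{\top}x_i + b) - 1 + \xi_i \ge 0,
% turn the bilevel problem into an MPEC, which is then smoothed and relaxed.

The next fragment is a hypothetical sketch of the Chapter 4 idea: a genetic search over the cost parameter in which every fitness evaluation solves the inner SVM training problem with an off-the-shelf solver. The dissertation uses the SVMlight package; scikit-learn's SVC is substituted here purely for illustration, and the population size, mutation scale, and data set are arbitrary choices, not the author's settings.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def validation_error(log10_c: float) -> float:
    """Train the soft-margin (L1-loss) SVM with C = 10**log10_c and return its
    validation error -- the quantity the upper-level problem minimizes."""
    model = SVC(kernel="linear", C=10.0 ** log10_c).fit(X_tr, y_tr)
    return float(np.mean(model.predict(X_val) != y_val))

# Genetic search over log10(C) in [-3, 3]: keep the better half of the
# population, then refill it with mutated copies of the survivors.
population = rng.uniform(-3.0, 3.0, size=20)
for _ in range(15):
    errors = np.array([validation_error(c) for c in population])
    parents = population[np.argsort(errors)[: population.size // 2]]
    children = rng.choice(parents, size=population.size - parents.size)
    children = children + rng.normal(scale=0.3, size=children.size)
    population = np.concatenate([parents, children])

best = population[np.argmin([validation_error(c) for c in population])]
print(f"selected cost parameter C ~= {10.0 ** best:.4g}")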
Keywords/Search Tags:Nonconvex Constrained Optimization, Nonsmooth Optimization, Mathematical Programming with Equilibrium Constraints (MPEC), First Order Optimality Conditions, Lagrangian, Support Vector Machine