Techniques in support vector classification

Posted on: 2002-06-11
Degree: Ph.D.
Type: Dissertation
University: Colorado State University
Candidate: Martin, Shawn Bryan
Full Text: PDF
GTID: 1468390011994849
Subject: Mathematics
Abstract/Summary:
This work falls into the field of Pattern Classification and, more generally, Artificial Intelligence. Classification is the problem of assigning a "pattern" z to be a member of a finite set ("class") X or a member of a disjoint finite set Y. When z ∈ R^n and X, Y ⊂ R^n, we can solve this problem using Support Vector Machines. Support Vector Machines are functions of the form

    f(z) = sign( ∑_i α_i k(x_i, z) + ∑_j β_j k(y_j, z) + b ),   (∗)

where k : R^n × R^n → R, and z is classified as a member of X = {x_i} if f(z) > 0 and as a member of Y = {y_j} otherwise. We consider three problems in classification, two of which concern Support Vector Machines.

Our first problem concerns feature selection for classification. Feature selection is the problem of identifying properties which distinguish between the two classes X and Y. Color, for example, distinguishes between apples and oranges, while shape may not. Our method of feature selection uses a novel combination of a linear classifier known as Fisher's discriminant and a nonlinear (polynomial) map known as the Veronese map. We apply our method to a problem in materials design.

Our second problem concerns the selection of the kernel k : R^n × R^n → R in (∗). For kernel selection we use a kernel version of the classical Gram-Schmidt orthonormalization procedure, again coupled with Fisher's discriminant. We apply our method to the materials design problem and to a handwritten digit recognition problem.

Finally, we consider the problem of training Support Vector Machines. Specifically, we develop a fast method for obtaining the coefficients α_i and β_j in (∗). Traditionally, these coefficients are found by solving a constrained quadratic programming problem. We present a geometric reformulation of the SVM quadratic programming problem. Using this reformulation, we then present a modified version of Gilbert's algorithm for obtaining the coefficients α_i and β_j. We compare our algorithm with the Nearest Point Algorithm and with Sequential Minimal Optimization.
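As a concrete illustration of the decision rule (∗), the following is a minimal Python sketch of how f(z) might be evaluated once the coefficients α_i, β_j and the bias b have been obtained by some training method. The polynomial kernel, the toy data, the hand-picked coefficients, and all function names are illustrative assumptions, not details taken from the dissertation.

import numpy as np

def poly_kernel(u, v, degree=2):
    """One common kernel choice: k(u, v) = (u·v + 1)^degree."""
    return (np.dot(u, v) + 1.0) ** degree

def svm_decision(z, X, Y, alpha, beta, b, kernel=poly_kernel):
    """Evaluate f(z) = sign( sum_i alpha_i k(x_i, z) + sum_j beta_j k(y_j, z) + b ).

    Returns +1 if z is assigned to class X = {x_i}, and -1 otherwise.
    """
    s = sum(a * kernel(x, z) for a, x in zip(alpha, X))
    s += sum(c * kernel(y, z) for c, y in zip(beta, Y))
    return 1 if s + b > 0 else -1

# Toy usage: two 1-D training points per class and hand-picked coefficients.
X = [np.array([1.0]), np.array([2.0])]      # class X
Y = [np.array([-1.0]), np.array([-2.0])]    # class Y
alpha, beta, bias = [0.5, 0.5], [-0.5, -0.5], 0.0
print(svm_decision(np.array([1.5]), X, Y, alpha, beta, bias))   # prints 1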
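The abstract does not spell out how the Veronese map and Fisher's discriminant are combined for feature selection, but the sketch below shows one plausible reading of the general idea: lift the data through a polynomial (Veronese-style) map, compute the Fisher discriminant direction in the lifted space, and rank the lifted monomial features by the magnitude of their discriminant weights. The ranking heuristic, the ridge term, and all function names are assumptions made for illustration only.

import numpy as np
from itertools import combinations_with_replacement

def veronese_map(X, degree=2):
    """Lift each row of X to all monomials of total degree <= degree."""
    n = X.shape[1]
    feats, names = [], []
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(n), d):
            feats.append(np.prod(X[:, list(idx)], axis=1))
            names.append(idx)                     # tuple of variable indices
    return np.column_stack(feats), names

def fisher_direction(A, B, ridge=1e-6):
    """Fisher's linear discriminant direction w = S_w^{-1} (mean_A - mean_B)."""
    mA, mB = A.mean(axis=0), B.mean(axis=0)
    Sw = np.cov(A, rowvar=False) + np.cov(B, rowvar=False)
    Sw += ridge * np.eye(Sw.shape[0])             # regularize for invertibility
    return np.linalg.solve(Sw, mA - mB)

def rank_features(X_class, Y_class, degree=2):
    """Score lifted (monomial) features by |Fisher weight| -- a simple
    stand-in for a feature-selection criterion, not the thesis's own."""
    A, names = veronese_map(X_class, degree)
    B, _ = veronese_map(Y_class, degree)
    w = fisher_direction(A, B)
    order = np.argsort(-np.abs(w))
    return [(names[i], abs(w[i])) for i in order]

# Example: rank polynomial features for two small synthetic classes.
rng = np.random.default_rng(0)
Xc = rng.normal(loc=+1.0, size=(20, 3))
Yc = rng.normal(loc=-1.0, size=(20, 3))
for name, score in rank_features(Xc, Yc, degree=2)[:5]:
    print(name, round(score, 3))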
Keywords/Search Tags: Support vector, Classification, Problem