
Experimental study and geometrical analysis of a linear programming support vector machine

Posted on: 2012-01-05
Degree: Ph.D
Type: Dissertation
University: Rutgers The State University of New Jersey - New Brunswick
Candidate: Ye, Jiankuan
GTID: 1458390008498931
Subject: Computer Science
Abstract/Summary:
This dissertation describes a systematic study of a linear programming SVM proposed by Vapnik, which directly minimizes the ratio of support vectors to the number of training samples in order to achieve the best generalization ability. It focuses mainly on the properties of the support vectors in the linear programming (LP) SVM, including their number, class distribution, and spatial location, as compared to the quadratic programming (QP) SVM. Our results show that the LP SVM achieves performance comparable to the QP SVM on all 50 synthetic datasets and 4 benchmark datasets. However, the LP SVM obtains its solution with far fewer support vectors. In all experiments with a linear kernel, the number of support vectors of the LP SVM never exceeds the number of dimensions. The support vectors of the LP SVM also lie farther from the opposite class, or from the corresponding separating hyperplane, than those of the QP SVM. In addition, the distribution of support vectors between the two classes can be very unbalanced in the LP SVM. By relating the LP SVM to a heuristic two-stage optimization problem and to the problem of representing the normal vector of the separating hyperplane by training samples, we carried out a geometrical analysis and provided an explanation that supports the experimental results.
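For readers unfamiliar with the LP SVM discussed above, the sketch below illustrates one common hard-margin, L1-norm formulation (minimize the sum of expansion coefficients subject to margin constraints) solved with scipy.optimize.linprog. This is a minimal illustration under assumed conventions, not the dissertation's implementation; the function name lp_svm_fit, the specific constraint set, and the assumption of separable data are all hypothetical.

```python
# Hypothetical sketch of an L1-norm (linear programming) SVM:
#   minimize  sum_i alpha_i
#   s.t.      y_i * (sum_j alpha_j y_j K(x_j, x_i) + b) >= 1   for all i
#             alpha_j >= 0
# Assumes separable data (hard margin); otherwise the LP is infeasible.
import numpy as np
from scipy.optimize import linprog

def lp_svm_fit(X, y, kernel=lambda a, b: a @ b.T):
    """Return expansion coefficients alpha and bias b of an LP SVM."""
    n = X.shape[0]
    K = kernel(X, X)                                  # n x n Gram matrix
    # Decision variables: [alpha_1..alpha_n, b_plus, b_minus],
    # with the bias split as b = b_plus - b_minus so all variables are >= 0.
    c = np.concatenate([np.ones(n), [0.0, 0.0]])      # objective: sum of alphas
    # Margin constraints rewritten in linprog's A_ub x <= b_ub form:
    #   -(sum_j alpha_j y_i y_j K_ij + y_i b) <= -1
    M = y[:, None] * K * y[None, :]                   # n x n block for the alphas
    A_ub = np.hstack([-M, -y[:, None], y[:, None]])   # columns for b_plus, b_minus
    res = linprog(c, A_ub=A_ub, b_ub=-np.ones(n),
                  bounds=[(0, None)] * (n + 2))
    alpha = res.x[:n]
    b = res.x[n] - res.x[n + 1]
    return alpha, b

# Support vectors are the training points with alpha_i > 0; the LP solution
# is typically much sparser than the corresponding QP solution.
```

The sparsity observed in the dissertation's experiments corresponds to most alpha_i being exactly zero at the LP optimum, since linear programs attain their optima at vertices of the feasible region.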
Keywords/Search Tags: SVM, Support, Linear programming