
Research On The Design And Application Of L1 Norm Large Interval Classifier

Posted on: 2020-10-15
Degree: Master
Type: Thesis
Country: China
Candidate: Z Y Kou
Full Text: PDF
GTID: 2438330626451010
Subject: Computer application technology
Abstract/Summary:
Support Vector Machine (SVM) seeks an optimal decision surface by minimizing the empirical error while simultaneously minimizing the structural risk, the latter measured by the so-called margin. It has been applied successfully in many areas of machine learning and pattern recognition. From the viewpoint of model design, because the L2-norm point-to-plane distance has an analytical expression, the margin derived from that distance can be interpreted as the distance between the two support planes. According to PAC (Probably Approximately Correct) theory, the generalization ability of a model, e.g., a classifier, can be measured by its margin: the larger the margin, the lower the classification error. For a non-L2 norm such as the L1 norm, however, the margin is difficult to compute because of its non-differentiability, so non-L2-norm large-margin machines have seldom been reported in the literature. On the other hand, it is well known that the classic L2-norm SVM is sensitive to outliers. From the computational viewpoint, training an SVM is also time-consuming, since its leading problem is solved by quadratic programming; on large-scale classification tasks, SVM suffers from a heavy training burden and limited memory. Under these circumstances, this thesis makes the following two contributions:

1) Based on our previous work, i.e., the analytical representation of the point-to-plane distance under the L1 norm, we design an L1-norm Maximum Margin Classifier, termed L1MMC. Its main characteristics are fourfold: (1) the margin is derived directly from the L1-norm metric, so it too can be represented analytically in the resulting minimization problem; (2) the optimization objective, as in SVM, is to minimize the empirical risk while maximizing the margin; (3) L1MMC can be solved by linear programming, rather than the quadratic programming of the classic SVM; (4) experiments show that on some datasets its test accuracy is comparable to, or even better than, that of SVM.

2) Inspired by L1MMC, the same idea is introduced into the twin support vector machine (TWSVM), and we propose a new version of TWSVM with the L1-norm metric, termed L1TWSVM. Compared with TWSVM, the proposed L1TWSVM has the following characteristics: (1) inheriting the geometrical interpretation of TWSVM, L1TWSVM also seeks two fitting hyperplanes, each corresponding to a convex optimization problem that minimizes both structural risk and empirical risk; (2) each leading problem needs only linear programming, rather than the two smaller-scale quadratic programs of TWSVM; (3) owing to the L1-norm metric, the proposed L1TWSVM is robust to outliers to a certain extent.
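To make the first contribution concrete, here is a sketch of how an L1-norm large-margin classifier can be posed as a linear program. It relies on the standard dual-norm fact that the L1 point-to-plane distance to the hyperplane w·x + b = 0 is |w·x + b| / ||w||_inf, so maximizing the L1 margin amounts to minimizing ||w||_inf, which linearizes with one auxiliary variable t. This is an illustrative sketch, not necessarily the exact L1MMC formulation of the thesis; the soft-margin parameter C and the use of scipy.optimize.linprog are our choices.

```python
import numpy as np
from scipy.optimize import linprog

def l1mmc_fit(X, y, C=1.0):
    """Soft-margin max-margin classifier under the L1 point-to-plane
    distance: min t + C*sum(xi) s.t. y_i(w.x_i + b) >= 1 - xi_i and
    |w_j| <= t, where t bounds ||w||_inf (dual norm of L1).
    Variable layout: [w (d), b, t, xi (n)]."""
    n, d = X.shape
    nvar = d + 2 + n
    c = np.zeros(nvar)
    c[d + 1] = 1.0          # minimize t  (= ||w||_inf, i.e. maximize L1 margin)
    c[d + 2:] = C           # plus C * sum(xi)  (empirical risk)

    rows, rhs = [], []
    # margin constraints: -y_i(w.x_i) - y_i*b - xi_i <= -1
    for i in range(n):
        r = np.zeros(nvar)
        r[:d] = -y[i] * X[i]
        r[d] = -y[i]
        r[d + 2 + i] = -1.0
        rows.append(r); rhs.append(-1.0)
    # |w_j| <= t, linearized as  w_j - t <= 0  and  -w_j - t <= 0
    for j in range(d):
        for s in (1.0, -1.0):
            r = np.zeros(nvar)
            r[j] = s
            r[d + 1] = -1.0
            rows.append(r); rhs.append(0.0)

    bounds = [(None, None)] * (d + 1) + [(0, None)] * (1 + n)
    res = linprog(c, A_ub=np.array(rows), b_ub=np.array(rhs), bounds=bounds)
    return res.x[:d], res.x[d]

# toy linearly separable data (hypothetical example)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = l1mmc_fit(X, y, C=10.0)
pred = np.sign(X @ w + b)
```

The entire problem is one linear program, in contrast with the quadratic program of the classic L2-norm SVM.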
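In the same spirit, the L1TWSVM idea of two LP-solvable fitting hyperplanes can be sketched as follows. Each subproblem keeps one plane L1-close to its own class while pushing the other class at least unit distance away; the absolute values in the fitting term are linearized with auxiliary variables. This is again only an illustrative sketch under our own formulation choices (the constraint signs, the parameter C, and the decision rule based on the L1 point-to-plane distance |w·x + b| / ||w||_inf are assumptions, not the thesis' exact model).

```python
import numpy as np
from scipy.optimize import linprog

def fit_plane(A, B, C=1.0):
    """One L1TWSVM-style subproblem as a linear program (sketch):
        min  sum(u) + C * sum(xi)
        s.t. -u_i <= a_i.w + b <= u_i        (so u_i = |a_i.w + b|)
             b_j.w + b <= -1 + xi_j,  xi >= 0
    Variable layout: [w (d), b, u (nA), xi (nB)]."""
    nA, d = A.shape
    nB = B.shape[0]
    nvar = d + 1 + nA + nB
    c = np.zeros(nvar)
    c[d + 1:d + 1 + nA] = 1.0      # L1 fitting term sum |a_i.w + b|
    c[d + 1 + nA:] = C             # empirical risk on the other class

    rows, rhs = [], []
    for i in range(nA):            # linearize the absolute values
        for s in (1.0, -1.0):
            r = np.zeros(nvar)
            r[:d] = s * A[i]
            r[d] = s
            r[d + 1 + i] = -1.0
            rows.append(r); rhs.append(0.0)
    for j in range(nB):            # push the other class past the plane
        r = np.zeros(nvar)
        r[:d] = B[j]
        r[d] = 1.0
        r[d + 1 + nA + j] = -1.0
        rows.append(r); rhs.append(-1.0)

    bounds = [(None, None)] * (d + 1) + [(0, None)] * (nA + nB)
    res = linprog(c, A_ub=np.array(rows), b_ub=np.array(rhs), bounds=bounds)
    return res.x[:d], res.x[d]

def predict(x, plane_pos, plane_neg):
    # assign to the class whose plane is closer in the assumed
    # L1 point-to-plane distance |w.x + b| / ||w||_inf
    d = [abs(x @ w + b) / np.abs(w).max() for (w, b) in (plane_pos, plane_neg)]
    return 1 if d[0] <= d[1] else -1

# toy data: each class lies near its own line (hypothetical example)
A = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])   # class +1
B = np.array([[3.0, -1.0], [4.0, 0.0], [5.0, 1.0]])  # class -1
p_pos = fit_plane(A, B, C=1.0)
p_neg = fit_plane(B, A, C=1.0)
labels = [predict(x, p_pos, p_neg) for x in np.vstack([A, B])]
```

As in TWSVM, two independent subproblems are solved, but here each is a linear program rather than a quadratic program.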
Keywords/Search Tags:L1 norm, support vector machine, margin, linear programming