
Research On Kernel Selection Of Support Vector Machine

Posted on: 2008-12-22    Degree: Doctor    Type: Dissertation
Country: China    Candidate: L K Luo    Full Text: PDF
GTID: 1118360242979139    Subject: Control theory and control engineering
Abstract/Summary:
In the last ten years there have been very significant developments in the theoretical understanding of Support Vector Machines (SVMs), proposed by Vapnik and others, as well as in algorithmic strategies for implementing them and in applications of the approach to practical problems. Nowadays, the selection of an SVM kernel with a suitable form and parameters (kernel selection) has become a key point in both theoretical research and practical application. In fact, the nonlinear processing ability of an SVM and the structure of its separating function are both largely decided by the choice of kernel function, and many difficulties remain in practice.

As a research work focused on kernel selection for SVMs, this dissertation mainly addresses the following problems:

1. On the modeling of SVMs, the principle of bisecting closest points under the L2-norm is first introduced. The relation between the solutions based respectively on the bisecting-closest-points principle and the maximum-margin principle is then derived, and the equivalence of the two solutions is established (a sketch of this nearest-point view is given after this list). The advantages of the bisecting-closest-points method are shown, including its better model properties, its more intuitive geometric meaning, and the option of reusing nearest-point algorithms. An SMO-type algorithm for the model based on the bisecting-closest-points principle under the L2-norm is also presented.

2. On the linearly separable structure of a sample set in feature space, a necessary and sufficient condition is obtained based on the null space of the kernel matrix (see the separability sketch after this list).

3. On the selection of base kernels in kernel learning, a new concept, the rank space diversity of matrices, is proposed as a diversity measure for the base-kernel matrices. The rule "the rank space diversity of the base-kernel matrices should be as large as possible" is then derived for the selection of base-kernel matrices. A kernel learning model based on the bisecting-closest-points principle under the L2-norm, together with its solving algorithm, is given, and the validity of this rule is shown by experiments.

4. On the criterion for kernel evaluation, a robustness concept for the separating function is proposed based on the anti-disturbance ability of the samples. From its properties, the maximum robustness of the separating function is proposed as a criterion for kernel evaluation. Experiments comparing the classic k-fold cross-validation, minimum-support-vectors, and maximum-robustness methods show that our criterion is effective: it overcomes the high time cost of k-fold cross-validation and the unstable testing accuracy of the minimum-support-vectors method (the cross-validation baseline is sketched after this list).

5. On kernel learning, a new method is proposed in which the base kernels are designed on each attribute and the robustness of the separating function is maximized. The corresponding solving algorithm for this kernel learning model is presented, and the validity and advantages of our method are shown by numerical experiments (a per-attribute base-kernel sketch follows this list).
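For context on point 1, the following is a minimal sketch of the geometric fact behind the bisecting-closest-points principle, not the dissertation's SMO-type algorithm: on linearly separable data, the maximum-margin hyperplane coincides, up to scaling, with the perpendicular bisector of the segment joining the closest points of the two classes' convex hulls. The toy data, solver choice, and all names below are illustrative assumptions; the nearest-point problem is solved with a generic SciPy routine rather than a dedicated nearest-point algorithm.

# Sketch: bisecting-closest-points view of a hard-margin linear SVM
# (illustrative toy data; not the dissertation's algorithm).
import numpy as np
from scipy.optimize import minimize
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_pos = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(20, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(20, 2))

def closest_hull_points(A, B):
    """Nearest points of conv(A) and conv(B): minimize ||A^T a - B^T b||^2
    over two probability simplices (a small QP solved with SLSQP)."""
    na, nb = len(A), len(B)

    def objective(z):
        a, b = z[:na], z[na:]
        d = A.T @ a - B.T @ b
        return d @ d

    cons = [
        {"type": "eq", "fun": lambda z: np.sum(z[:na]) - 1.0},
        {"type": "eq", "fun": lambda z: np.sum(z[na:]) - 1.0},
    ]
    z0 = np.concatenate([np.full(na, 1.0 / na), np.full(nb, 1.0 / nb)])
    res = minimize(objective, z0, bounds=[(0.0, 1.0)] * (na + nb),
                   constraints=cons, method="SLSQP")
    a, b = res.x[:na], res.x[na:]
    return A.T @ a, B.T @ b

p, q = closest_hull_points(X_pos, X_neg)
w_bisect = p - q                         # normal of the bisecting hyperplane
b_bisect = -w_bisect @ (p + q) / 2.0     # hyperplane passes through the midpoint

# Hard-margin linear SVM, approximated with a large C; on separable data the
# normalized hyperplanes should coincide.
svc = SVC(kernel="linear", C=1e6).fit(
    np.vstack([X_pos, X_neg]), np.hstack([np.ones(20), -np.ones(20)]))

w1 = w_bisect / np.linalg.norm(w_bisect)
b1 = b_bisect / np.linalg.norm(w_bisect)
w2 = svc.coef_[0] / np.linalg.norm(svc.coef_[0])
b2 = svc.intercept_[0] / np.linalg.norm(svc.coef_[0])
print("bisecting-closest-points hyperplane:", w1, b1)
print("maximum-margin (SVC) hyperplane:    ", w2, b2)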
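For point 2, the dissertation's exact necessary-and-sufficient condition is not reproduced here. The sketch below only illustrates the well-known special case it contains: a kernel matrix with a trivial null space (a strictly positive definite Gram matrix) means the mapped samples are linearly independent, so every labelling of them is linearly separable in feature space. The data and kernels are illustrative assumptions.

# Sketch: inspect the null space of a Gram matrix (known special case only).
import numpy as np
from scipy.linalg import null_space
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

for name, K in [("linear", linear_kernel(X)), ("rbf", rbf_kernel(X, gamma=1.0))]:
    N = null_space(K, rcond=1e-10)   # orthonormal basis of the null space of K
    print(f"{name:6s} kernel: null-space dimension = {N.shape[1]}")
    if N.shape[1] == 0:
        print("        -> Gram matrix is nonsingular: the mapped samples are linearly")
        print("           independent, so every labelling is separable in feature space.")
    else:
        print("        -> Gram matrix is singular: separability of a given labelling")
        print("           depends on how the labels interact with this null space.")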
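For point 4, the maximum-robustness and minimum-support-vectors criteria are specific to the dissertation and are not reproduced here. The sketch below only shows the classical k-fold cross-validation baseline they are compared against, which refits k models for every candidate kernel parameter (the time cost the abstract refers to). The data set and parameter grid are illustrative assumptions.

# Sketch: k-fold cross-validation as a kernel-evaluation baseline.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

param_grid = {
    "svc__kernel": ["rbf"],
    "svc__gamma": [1e-3, 1e-2, 1e-1, 1.0],   # candidate kernel parameters
    "svc__C": [1.0, 10.0, 100.0],
}
search = GridSearchCV(
    make_pipeline(StandardScaler(), SVC()),
    param_grid, cv=5, n_jobs=-1)              # 5-fold CV: 5 refits per candidate
search.fit(X, y)
print("selected kernel parameters:", search.best_params_)
print("cross-validated accuracy:  ", round(search.best_score_, 3))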
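For point 5, the following sketch only shows the "one base kernel per attribute" construction with placeholder uniform weights; the dissertation instead learns the combination weights by maximizing the robustness of the separating function, which is not reproduced here. The data set, kernel type, and weighting are illustrative assumptions.

# Sketch: per-attribute base kernels combined into one precomputed SVM kernel.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def per_attribute_kernels(A, B, gamma=1.0):
    """One RBF base kernel per attribute (column), evaluated between A and B."""
    return [rbf_kernel(A[:, [j]], B[:, [j]], gamma=gamma) for j in range(A.shape[1])]

weights = np.full(X.shape[1], 1.0 / X.shape[1])   # uniform placeholder weights
K_tr = sum(w * K for w, K in zip(weights, per_attribute_kernels(X_tr, X_tr)))
K_te = sum(w * K for w, K in zip(weights, per_attribute_kernels(X_te, X_tr)))

clf = SVC(kernel="precomputed", C=1.0).fit(K_tr, y_tr)
print("test accuracy with uniformly combined per-attribute kernels:",
      clf.score(K_te, y_te))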
Keywords/Search Tags: SVM, Kernel Selection, Rank Space Diversity, Robustness, Bisecting Closest Points