
Research On Non-sparse Multi-Kernel Support Vector Machine

Posted on: 2016-05-06
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Q H Hu
Full Text: PDF
GTID: 1368330482957973
Subject: Computer software and theory
Abstract/Summary:
Kernel methods are an effective way to solve non-linear pattern recognition problems in machine learning. They measure the similarity between examples xi and xj by means of a kernel function k(xi, xj): the input data are first mapped into a high-dimensional feature space H through a non-linear map, and a linear decision function is then sought in that space. Traditional kernel-based methods are mostly single-kernel methods built on a single feature map. Multiple kernel learning (MKL) is an effective way to address the kernel-selection problem: it offers better interpretability and greater scalability, and on real problems it often achieves better generalization performance than traditional single-kernel methods.

A simple and effective form of multiple kernel learning considers linear convex combinations of several basic kernels. Under the MKL framework, choosing a representation of the input data in the feature space becomes the selection of the basic kernel functions and their combination coefficients. In recent years many strong MKL methods have been investigated and widely applied. An Lp-norm constraint is imposed on the combination coefficients of the basic kernels to avoid overfitting. With p = 1 the resulting coefficients are sparse, since most of them become 0. Sparsity makes the model easier to interpret in terms of kernel selection, because irrelevant and costly kernel functions are discarded, which reduces redundancy and improves efficiency. However, discarding kernels can also throw away complementary information, so Kloft et al. proposed a non-sparse multiple kernel learning method that imposes an Lp (p > 1) norm constraint on the combination coefficients.
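The combination scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the dissertation: the RBF bandwidths and coefficient values are made up, and `project_lp` simply rescales non-negative coefficients onto the Lp unit sphere, which is how the norm constraint is commonly enforced.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Gaussian RBF base kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def combined_gram(X, gammas, beta):
    """Convex combination K = sum_m beta_m * K_m of base Gram matrices."""
    return sum(b * rbf_kernel(X, X, g) for b, g in zip(beta, gammas))

def project_lp(beta, p):
    """Rescale non-negative coefficients so that ||beta||_p = 1."""
    beta = np.maximum(np.asarray(beta, dtype=float), 0.0)
    return beta / np.linalg.norm(beta, ord=p)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
beta = project_lp([0.9, 0.1, 0.5], p=2)   # p > 1: every kernel keeps weight
K = combined_gram(X, [0.1, 1.0, 10.0], beta)
print(np.allclose(K, K.T))                # True: a sum of PSD kernels stays symmetric
```

With p > 1 all coefficients stay strictly positive here, whereas an L1-constrained optimizer would typically drive most of them to exactly zero.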
Compared with L1-norm-constrained methods, it has stronger noise resistance and better robustness.

The support vector machine (SVM) is an effective learning method that is widely used in machine learning. Compared with other methods, it has clear advantages when dealing with small-sample, non-linear, and high-dimensional pattern recognition problems. Like other kernel methods, SVM faces the problem of selecting a kernel function and its parameters when the kernel trick is used. The traditional approach is to select and tune them manually based on experience, which is often unreliable for lack of theoretical support. We introduce multiple kernel learning into SVM to solve the problems of kernel selection and parameter setting. The goal is to develop effective multiple kernel learning methods while improving the generalization performance of SVM. Specifically, we completed work in the following four areas:

(1) We propose a novel non-sparse multiple kernel learning method, QN-MKL, which solves the model in the primal. Traditional methods usually transform the objective function into a saddle-point problem and solve it in the dual of the primal problem; solving the primal and solving its dual are equivalent, and many studies suggest that solving the problem directly in the primal often has better convergence properties than solving it in the dual. Subgradient and quasi-Newton methods are used to solve the standard SVM. The superlinear convergence of the quasi-Newton method gives QN-MKL a relatively fast convergence speed and better generalization performance than existing methods.

(2) We propose a non-sparse multiple kernel semi-supervised SVM learning method, Lp-MKL-S3VM. We use a quasi-Newton method, simulated annealing, and a local search based on pairwise label exchange to optimize the objective function.
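For context on how the combination coefficients are typically optimized in Lp-norm MKL, the literature following Kloft et al. alternates between solving an SVM on the current combined kernel and updating the coefficients with a closed-form rule driven by the per-kernel block norms ||w_m||. The sketch below implements that update from the general literature, as I recall it; it is not the dissertation's QN-MKL algorithm, and the `w_norms` values are hypothetical.

```python
import numpy as np

def lp_mkl_beta_update(w_norms, p):
    """Closed-form Lp-norm MKL coefficient update (Kloft et al. style):
    beta_m is proportional to ||w_m||^(2/(p+1)), then the vector is
    rescaled so that ||beta||_p = 1."""
    w = np.asarray(w_norms, dtype=float)
    beta = w ** (2.0 / (p + 1.0))
    return beta / np.linalg.norm(beta, ord=p)

norms = [2.0, 1.0, 0.5, 0.0]              # hypothetical per-kernel norms ||w_m||
b1 = lp_mkl_beta_update(norms, p=1.0)     # p = 1: weight concentrates
b2 = lp_mkl_beta_update(norms, p=2.0)     # p = 2: weight spreads across kernels
```

Larger p flattens the exponent 2/(p+1), so the weights are shared more evenly across the base kernels, which is the non-sparse behavior the abstract argues for.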
We add both basic kernels and manifold kernels to the kernel pool, so that the geometric properties of the data can be used efficiently in the learning process, overcoming the limitations of methods based on the cluster assumption alone. Simulations on artificial and real datasets demonstrate the effectiveness of the proposed algorithm.

(3) We propose a non-sparse multiple kernel support vector regression method, NS-MKR. Experiments on several artificial and real datasets show that the proposed method achieves a smaller fitting error than existing sparse multiple kernel learning methods without increasing model complexity. This indicates that non-sparse kernel combinations help improve the performance of the algorithm, enhance modeling accuracy, and improve interpretability and robustness.

(4) We propose a non-sparse multiple kernel learning method, Lp-MKLBoost, based on the boosting framework. In every iteration of the boosting process, the optimal weak classifier is obtained through kernel fusion learned by a non-sparse MKL method. By imposing an L2-norm constraint on the combination coefficients, kernels that a sparse scheme would discard are retained alongside the best ones, so useful feature information is kept. The proposed method possesses the characteristics of both ensemble learning and multiple kernel learning.
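The "smaller fitting error" in part (3) is measured against the standard loss that support vector regression optimizes: residuals inside an epsilon-wide tube cost nothing, and only larger deviations are penalized. A minimal sketch of that epsilon-insensitive loss (standard SVR machinery, not the NS-MKR model itself):

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Epsilon-insensitive loss used by support vector regression:
    max(|y - f(x)| - eps, 0), so residuals inside the tube are free."""
    r = np.abs(np.asarray(y_true) - np.asarray(y_pred))
    return np.maximum(r - eps, 0.0)

loss = eps_insensitive_loss([1.0, 1.0, 1.0], [1.05, 1.3, 0.7], eps=0.1)
# residual 0.05 lies inside the tube -> 0; residuals 0.3 -> 0.2 each
```

In an MKL regression setting, the kernel expansion f(x) that produces `y_pred` would be built from the combined kernel, with the combination coefficients under the non-sparse Lp constraint.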
Keywords/Search Tags:Non-sparse Kernel Combination, Multiple Kernel Learning, Support Vector Machine, Support Vector Regression, Ensemble Learning