
Research On Multi-class Multiple Kernel Learning With Diversity Of Classes

Posted on: 2016-02-06
Degree: Master
Type: Thesis
Country: China
Candidate: D Y Zhang
Full Text: PDF
GTID: 2308330503476713
Subject: Computer application technology
Abstract/Summary:
In recent years, Multiple Kernel Learning (MKL) has gradually become a hot topic in machine learning. It replaces a single kernel with a combination of multiple base kernels, so that kernel selection is transformed into learning the combination coefficients of the base kernels, which effectively improves the descriptive ability and generalization performance of kernel methods. Most existing MKL methods focus on binary classification; however, many real-world problems are inherently multi-class classification problems. It is therefore necessary to generalize MKL to multi-class scenarios.

Existing multi-class multiple kernel learning (MCMKL) methods usually adopt one-versus-one or one-versus-all strategies to handle multi-class problems and learn a common kernel combination shared by all classes. However, diversity of classes is common in real-world data: the data of different classes may come from different distributions or lie in different feature spaces. If the model learns a single common kernel combination, all data are mapped into the same feature space by the same mapping, which is clearly unreasonable. To address this problem, we introduce the diversity of classes into MKL by learning a different kernel combination for each class. We propose two algorithms: l_p-norm multiple kernel learning with diversity of classes (LMKLDC), and l_p-norm multiple kernel learning with diversity of classes and maximum multi-class margin (M3_LMKLDC). The main contributions of this thesis are as follows:

1. The necessity of introducing the diversity of classes into MKL is analyzed. Real-world data often come from different data sources, and each class may follow a different distribution. Most MKL methods nevertheless learn a common kernel combination for all classes, so that all data are mapped into the same feature space, which reduces the expressive ability of the model and further degrades its generalization performance.

2. A multi-class multiple kernel learning method, named LMKLDC, is proposed. It introduces the diversity of classes into MKL by learning a different kernel combination for each class, and it uses the l_p-norm to promote sparsity of the model. A two-stage optimization algorithm is designed for the corresponding optimization problem.

3. Another multi-class multiple kernel learning method, named M3_LMKLDC, is proposed. It further adopts a margin criterion suited to multi-class problems, consisting of the multi-class hinge loss and the maximum multi-class kernel margin, so as to exploit the information between classes effectively. It also accounts for the diversity of classes by learning a different kernel combination for each class and uses the l_p-norm to control model complexity. Comparative experiments validate the effectiveness of both methods.
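The central modeling idea described above, one kernel combination per class under an l_p-norm constraint together with a multi-class margin, can be written down schematically as follows. This is a minimal sketch based only on what the abstract states; the exact objective, constraints, and notation of LMKLDC and M3_LMKLDC in the full thesis may differ, and the decision functions f_c are assumed notation.

```latex
% Per-class kernel combination with an l_p-norm constraint (sketch).
% k_1, ..., k_M are base kernels; \beta_c are the class-specific weights.
\[
  k_c(x, x') \;=\; \sum_{m=1}^{M} \beta_{c,m}\, k_m(x, x'),
  \qquad \beta_{c,m} \ge 0, \quad \|\boldsymbol{\beta}_c\|_p \le 1,
  \quad c = 1, \dots, C .
\]
% A multi-class hinge loss of the usual form could then couple the
% per-class decision functions f_c(x):
\[
  \ell(x_i, y_i) \;=\;
  \max\Bigl(0,\; 1 - f_{y_i}(x_i) + \max_{c \neq y_i} f_c(x_i)\Bigr).
\]
```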
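For readers who want to experiment with the idea of class-specific kernel combinations, the short NumPy sketch below shows how per-class Gram matrices could be assembled from precomputed base kernels. The function name, the simple rescaling onto the l_p ball, and the array shapes are illustrative assumptions; the learning of the weights, the two-stage optimization algorithm, and the multi-class margin objective from the thesis are not reproduced here.

```python
import numpy as np

def combine_kernels_per_class(base_kernels, betas, p=2.0):
    """Combine M precomputed base Gram matrices into one kernel per class.

    base_kernels : ndarray of shape (M, n, n), precomputed base kernels K_m.
    betas        : ndarray of shape (C, M), nonnegative class-specific weights.
    p            : order of the l_p-norm used to rescale each weight vector,
                   mirroring the l_p-norm constraint mentioned in the abstract.

    Returns an ndarray of shape (C, n, n): one combined Gram matrix per class.
    Illustrative only; how the thesis learns the weights is not shown.
    """
    betas = np.maximum(np.asarray(betas, dtype=float), 0.0)
    # Rescale each class's weights so that ||beta_c||_p <= 1.
    norms = np.linalg.norm(betas, ord=p, axis=1, keepdims=True)
    betas = betas / np.maximum(norms, 1.0)
    # K_c = sum_m beta_{c,m} * K_m for every class c.
    return np.einsum('cm,mij->cij', betas, np.asarray(base_kernels, dtype=float))

# Tiny usage example: 3 base kernels, 2 classes, 5 samples.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 4))
    # One linear kernel plus two RBF kernels with different bandwidths.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    base = np.stack([X @ X.T, np.exp(-sq / 1.0), np.exp(-sq / 10.0)])
    weights = rng.random(size=(2, 3))
    print(combine_kernels_per_class(base, weights).shape)  # (2, 5, 5)
```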
Keywords/Search Tags: Multiple kernel learning, Multi-class classification, Diversity of classes, l_p-norm, Multi-class margin