
The Research And Application Of Multiple Kernel Learning Method

Posted on: 2016-04-02
Degree: Master
Type: Thesis
Country: China
Candidate: R F Zhang
Full Text: PDF
GTID: 2308330464963622
Subject: Computer Science and Technology
Abstract/Summary:
Multiple kernel learning (MKL) has become a hot research topic in the field of machine learning. The kernel trick is an effective way to solve non-linear pattern analysis problems. However, sample data are known to be diverse and uncertain, so combining multiple kernels has become an inevitable trend for achieving good generalization performance. The MKL model is built on kernel methods and is more flexible. Recent theoretical research and real applications have shown that using multiple kernels instead of a single one can improve performance and enhance the interpretability of the decision function. Through the kernel method, the problem of data representation in the feature space is transferred to the choice of kernel weights in the MKL framework, so different choices of kernel weights construct different MKL models. The sparseness of the kernel weights affects the performance of MKL models. Different methods can be used to obtain the optimal kernel matrix during training, and each optimization method has its own pros and cons in computational efficiency. Therefore, an intensive study of MKL is of great significance.

For the binary classification problem, this paper builds on the support vector machine, describes the basic framework of MKL, and introduces the classical sparse and non-sparse MKL models. On this foundation, the paper proposes a generalized sparse MKL model and a framework for non-sparse MKL based on the extreme learning machine. The main work is as follows:

1. This paper makes a deep study of sparse and non-sparse MKL based on the support vector machine. UCI datasets are tested in our experiments, and we analyze how the sparseness of the kernel weights affects the performance of MKL, as well as the computational efficiency of different optimization techniques.

2.
This paper proposes a new MKL model, the generalized sparse MKL (GSMKL) model, by introducing a constraint combining the L1-norm and the Lp-norm (p > 1) on the kernel weights of the primal MKL model, which allows the sparseness to be adjusted flexibly. To verify the properties of the model, we design a corresponding training algorithm. Experimental results on synthetic and UCI datasets show that the improved algorithm is effective and feasible.

3. This paper proposes a non-sparse MKL approach based on the extreme learning machine (ELM). This method enhances overall computational efficiency and training speed by applying the optimization algorithm of ELM within the non-sparse MKL framework. Experimental results on gene expression and UCI datasets demonstrate the computational efficiency of the improved algorithm.

This paper mainly focuses on the computational complexity and the impact of kernel-weight sparseness on classification performance. GSMKL can not only achieve good classification accuracy but also control the sparseness of the kernel weights. Moreover, the computational efficiency and training speed are improved by introducing ELM.
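As a concrete illustration of the MKL framework described above, the sketch below builds a combined Gram matrix as a weighted sum of base kernels, K = Σ_m β_m K_m. The RBF base kernels, the specific weight values, and the function names are my own assumptions for illustration, not taken from the thesis; the contrast between the two weight profiles mirrors the sparse (L1) versus non-sparse (Lp, p > 1) distinction the abstract discusses:

```python
import numpy as np

def rbf_gram(X, gamma):
    # Gram matrix K_m of one Gaussian (RBF) base kernel on the sample set X
    sq = np.sum(X ** 2, axis=1)
    d = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d)

def combined_gram(X, gammas, beta):
    # Weighted combination K = sum_m beta_m * K_m used throughout MKL
    K = np.zeros((X.shape[0], X.shape[0]))
    for gamma, b in zip(gammas, beta):
        K += b * rbf_gram(X, gamma)
    return K

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))

# Sparse (L1-style) vs non-sparse (Lp-style) kernel-weight profiles;
# the values are illustrative, both sum to 1
beta_l1 = np.array([1.0, 0.0, 0.0])   # only one base kernel survives
beta_lp = np.array([0.5, 0.3, 0.2])   # all base kernels contribute
K = combined_gram(X, [0.1, 1.0, 10.0], beta_lp)
```

Since each RBF base kernel has a unit diagonal, the diagonal of the combined matrix equals the sum of the weights; choosing between a profile like `beta_l1` and one like `beta_lp` is exactly the sparsity trade-off the thesis studies.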
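Item 3 rests on the extreme learning machine's one-shot training scheme: a random hidden layer followed by a closed-form least-squares output layer. The minimal binary-classification ELM below is my own sketch of that standard scheme, not the thesis's implementation, and the synthetic labelling rule is invented for the example; it shows why ELM training is fast — there is no iterative optimization:

```python
import numpy as np

def elm_train(X, y, n_hidden, rng):
    # Basic ELM: random input weights, closed-form least-squares output weights
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # one pseudo-inverse solve
    return W, b, beta

def elm_predict(X, W, b, beta):
    # Sign of the output-layer response gives the binary label
    return np.sign(np.tanh(X @ W + b) @ beta)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] * X[:, 1])                    # simple non-linear labels
W, b, beta = elm_train(X, y, 50, rng)
acc = np.mean(elm_predict(X, W, b, beta) == y)    # training accuracy
```

Because the only "training" is a single pseudo-inverse solve, plugging this optimization step into a non-sparse MKL loop is what gives the approach in item 3 its speed advantage over repeatedly retraining an SVM.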
Keywords/Search Tags:multiple kernel learning, classification, sparsity, extreme learning machine, training complexity