
Fast Multiple Kernel Learning For Classification And Application

Posted on: 2016-07-11    Degree: Doctor    Type: Dissertation
Country: China    Candidate: T Sun    Full Text: PDF
GTID: 1108330464962881    Subject: Circuits and Systems
Abstract/Summary:
Kernel learning is a popular non-linear classification method. It constructs a kernel matrix to describe the similarity between samples in a high-dimensional space. Multiple kernel learning (MKL) is an extension of kernel learning: compared with single kernel learning, MKL places multiple sub-kernels in a unified optimization framework and seeks the best combination of these sub-kernels. MKL avoids manual tuning of kernel parameters and makes the learning procedure more automatic. If each sub-kernel corresponds to a single feature or dimension, MKL can also be used to solve feature selection and dimensionality reduction problems. However, its high computational complexity makes training very slow, which renders MKL impractical in real applications. To address this problem, two novel fast MKL methods are proposed. The first pre-selects beneficial sub-kernels before MKL optimization. The second constructs sub-kernel matrices with random kernel functions, which reduces the sub-kernel scale. In addition, guided by this theoretical study of MKL, MKL is successfully applied to hyperspectral image and visual image classification. The main contributions can be summarized as follows:

(1) To address the high computational complexity of MKL, a Selective Multiple Kernel Learning (SMKL) method is proposed. Theoretical analysis shows that MKL can be regarded as a special form of ensemble learning (EL). Therefore, MKL can use an ensemble strategy to pre-select highly discriminative and diverse sub-kernels before MKL optimization. To evaluate the discrimination and diversity of sub-kernels, a new kernel evaluation method is designed. Compared with the classical kernel evaluation method, Kernel Alignment (KA), the new method provides quantified and more accurate results. By pre-selecting beneficial sub-kernels, SMKL saves memory and accelerates the training procedure.
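The pre-selection idea behind SMKL can be illustrated with a small numpy sketch. The dissertation's own quantified evaluation measure is not specified in this abstract, so classical kernel-target alignment (the KA baseline it improves on) stands in for it; the RBF widths, the top-2 cut-off, and the uniform combination are illustrative assumptions, not the actual SMKL procedure.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # pairwise squared distances, then Gaussian (RBF) similarity
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def alignment(K, y):
    """Kernel-target alignment: cosine similarity (under the Frobenius
    inner product) between K and the ideal kernel y y^T."""
    Ky = np.outer(y, y)
    return (K * Ky).sum() / (np.linalg.norm(K) * np.linalg.norm(Ky))

# toy two-class data with labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)),
               rng.normal(+1.0, 0.5, (20, 2))])
y = np.r_[-np.ones(20), np.ones(20)]

# candidate sub-kernels: RBF kernels over a grid of widths (assumed)
gammas = [0.01, 0.1, 1.0, 10.0]
kernels = [rbf_kernel(X, g) for g in gammas]

# pre-select the most discriminative sub-kernels, then combine them,
# so the subsequent MKL optimization sees far fewer candidates
scores = [alignment(K, y) for K in kernels]
keep = np.argsort(scores)[-2:]                     # top-2 by alignment
K_mkl = sum(kernels[i] for i in keep) / len(keep)  # uniform combination
```

Only the pre-selected sub-kernels enter the (omitted) MKL optimization, which is where the memory and time savings come from.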
In particular, SMKL can be combined with MKL under an L∞-norm constraint, greatly reducing computation time and memory; the whole computational cost equals a single run of single kernel learning. Extensive experiments demonstrate that the proposed method requires less time and memory while obtaining comparable or higher classification accuracy than traditional MKL methods.

(2) To reduce the time needed to construct sub-kernel matrices, a novel random-kernel-based MKL method is proposed, inspired by the Extreme Learning Machine (ELM). Given enough hidden nodes in a single-hidden-layer feedforward neural network (SLFN) and an activation function that is infinitely differentiable on any interval, an SLFN can approximate the input samples with zero error, and its parameters can be chosen randomly from any interval. Because the weights in the activation function are random, ELM is a non-parametric learning machine. Using the activation function of ELM to construct the sub-kernels of MKL greatly reduces the candidate sub-kernel scale. Moreover, this construction can be combined with SMKL to further accelerate sub-kernel construction and training. Experiments show that the proposed method is faster, uses less memory, and obtains better results than existing fast MKL methods.

(3) MKL is applied to hyperspectral remote sensing image classification. To cope with the high dimensionality of hyperspectral images, a novel two-stage MKL (TMKL) is proposed for dimensionality reduction. Analyzing the generalization error of MKL via Rademacher complexity shows that the supremum of the generalization error bound grows with the number of sub-kernels. Since a hyperspectral image is high-dimensional, it produces many sub-kernels, so directly applying MKL to dimensionality reduction of hyperspectral images is unreasonable. In this work, a new two-stage MKL is therefore designed.
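The ELM-style sub-kernel construction of contribution (2) can be sketched as follows. The sigmoid activation and the hidden-layer sizes are illustrative assumptions; the point is that the random weights need no tuning, so a handful of random sub-kernels replaces a large grid of parametric candidates.

```python
import numpy as np

def elm_random_kernel(X, n_hidden, rng):
    """ELM-style sub-kernel: pass inputs through a random single hidden
    layer with a sigmoid activation and take inner products of the
    hidden representations. Random weights require no tuning."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer outputs
    return H @ H.T                                   # kernel matrix K = H H^T

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 5))

# a few random sub-kernels of different hidden sizes (sizes assumed)
# stand in for a large grid of parametric kernel candidates
subkernels = [elm_random_kernel(X, h, rng) for h in (16, 32, 64)]
```

Each matrix is symmetric positive semi-definite by construction (a Gram matrix of hidden-layer features), so it is a valid sub-kernel for MKL.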
By pre-selecting the sub-kernels beneficial for classification, TMKL reduces the number of sub-kernels to be optimized and obtains better results than standard MKL. Experiments on several hyperspectral images indicate that TMKL achieves better classification performance than classical feature selection methods.

(4) MKL is applied to imbalanced classification of hyperspectral remote sensing images. When some classes have far fewer pixels than others, traditional classification methods are inappropriate because they tend to assign all pixels to the majority classes. Ensemble learning is a good approach to such imbalanced problems, but existing ensemble schemes are independent of the classifier and therefore may not achieve the best performance for a particular classifier. Owing to the high dimensionality and small sample sizes involved, the support vector machine (SVM) is a very popular classifier for hyperspectral images. Since SVM uses the maximum margin as its classification criterion, this criterion is adopted to guide the ensemble learning procedure for imbalanced hyperspectral image classification. Experiments show that our method obtains higher classification accuracy than representative imbalanced classification methods for hyperspectral images.

(5) MKL is applied to visual image classification. Because the fixed partition scheme of spatial pyramid decomposition (SPD) is too rigid for diverse categories, a flexible spatial pyramid decomposition (FSPD) method is proposed. SPD encodes the spatial position of features and thereby improves classification performance. FSPD is not limited to orthogonal partitions and can partition an image arbitrarily, so choosing the best partition among the many candidates becomes crucial. The quantified kernel evaluation method proposed in the first contribution is used to evaluate the different partitions.
A clonal selection algorithm (CSA) is then used to search for a proper FSPD for each category. Experiments on several visual image datasets indicate that FSPD has a clear advantage over traditional SPD.
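For reference, the classical orthogonal SPD baseline that FSPD generalizes can be sketched as below: per-cell visual-word histograms over successively finer grids are concatenated into one descriptor. FSPD's arbitrary (non-orthogonal) partitions and the CSA search are not reproduced, and the grid levels are illustrative.

```python
import numpy as np

def spd_histogram(xy, words, vocab_size, levels=(1, 2, 4)):
    """Spatial pyramid descriptor: concatenate per-cell visual-word
    histograms over successively finer orthogonal grids.
    xy    : (n, 2) keypoint coordinates normalised to [0, 1)
    words : (n,)   visual-word index of each keypoint"""
    parts = []
    for g in levels:
        cell = np.minimum((xy * g).astype(int), g - 1)  # grid cell per point
        idx = cell[:, 0] * g + cell[:, 1]               # flattened cell index
        for c in range(g * g):
            parts.append(np.bincount(words[idx == c], minlength=vocab_size))
    return np.concatenate(parts)

# toy usage: 50 keypoints, a 3-word vocabulary, a two-level pyramid
rng = np.random.default_rng(1)
xy = rng.random((50, 2))
words = rng.integers(0, 3, 50)
desc = spd_histogram(xy, words, vocab_size=3, levels=(1, 2))
```

Because the partition fixes the spatial layout of the descriptor, evaluating alternative partitions (as FSPD does with the quantified kernel evaluation) amounts to comparing the kernels these descriptors induce.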
Keywords/Search Tags: Multiple kernel learning, kernel evaluation, hyperspectral image classification, imbalanced classification, visual image classification