
Research on Multi-Kernel Support Vector Machine

Posted on: 2020-06-26    Degree: Master    Type: Thesis
Country: China    Candidate: B Y Zang    Full Text: PDF
GTID: 2428330590478386    Subject: Control Engineering
Abstract/Summary:
The Support Vector Machine (SVM), proposed by Vapnik, is a machine learning method based on Statistical Learning Theory (SLT). Vapnik and his colleagues replaced the traditional empirical risk minimization principle with structural risk minimization to construct the SVM model. Since its introduction, SVM has attracted wide attention and study from scholars at home and abroad, and a number of improved models have been developed from it, such as the Twin Support Vector Machine (TWSVM), Support Vector Regression (SVR), and the Least Squares Support Vector Machine (LSSVM).

SVM was originally formulated for linear classification. For nonlinear problems, kernel functions are used to map the data into a high-dimensional space for training. Common kernel functions include the linear kernel, the polynomial kernel, and the radial basis function (RBF) kernel. These, however, train the data with a single kernel and are not well suited to heterogeneous, high-dimensional, complex data. Multi-Kernel Learning (MKL) addresses this by organically combining several different kernels, so that different dimensions of the training data receive their own feature mappings and better training results can be achieved.

Addressing the problems that SVM and its improved algorithms face when training on complex data, this thesis makes the following contributions:

1. A Multi-Kernel Sparse Least Squares Support Vector Regression (MKS-LSSVR) based on compressed sensing is proposed. Compared with the standard SVM, LSSVR lacks sparsity because it uses all of the data as support vectors during training. Inspecting the LSSVR model shows that it matches the form of the optimization objective in Compressive Sensing (CS) theory, and CS reconstruction algorithms are highly efficient sparse solvers. A CS reconstruction algorithm is therefore used to solve the LSSVR optimization problem and obtain a sparse solution. The training model under the multi-kernel setting is obtained by a kernel-expansion method, and the strengths and weaknesses of the different models are compared.

2. A Multi-Scale kernel Least Squares Support Vector Regression is proposed. Wavelet decomposition theory shows that any function can be expressed as a linear combination of wavelet functions. Accordingly, the LSSVR model is decomposed into large-scale and small-scale components; the objective function is regressed and then iteratively approximated, which ultimately improves the regression accuracy.

3. A wavelet-kernel TWSVM is proposed and extended to LSTSVM. Combining TWSVM with a wavelet kernel avoids the singularity that may arise in the TWSVM solution process, and at the same time improves the model's training accuracy on heterogeneous high-dimensional data. A wavelet kernel is a cluster of functions that can approximate an arbitrary function; the kernel matrix constructed from a wavelet kernel is orthogonal or nearly orthogonal, which avoids the problem of a non-invertible matrix and retains more details of the sample distribution.
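The abstract does not include code, so as a rough illustration of the multi-kernel LS-SVR setting a minimal sketch follows: a convex combination of RBF kernels at two scales is plugged into the standard LS-SVR dual, which reduces to one linear system in which every training sample is a support vector. All function names and parameter values here are illustrative, not taken from the thesis.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Gaussian (radial basis) kernel matrix between two sample sets
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * d2)

def multi_kernel(X1, X2, gammas, weights):
    # Convex combination of base kernels -- the simplest multi-kernel form
    return sum(w * rbf_kernel(X1, X2, g) for w, g in zip(weights, gammas))

def lssvr_fit(K, y, C=100.0):
    # LS-SVR dual: a single bordered linear system; no sparsity --
    # every training sample receives a nonzero coefficient
    n = len(y)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)),  K + np.eye(n) / C]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, coefficients alpha

def lssvr_predict(K_test_train, alpha, b):
    return K_test_train @ alpha + b
```

Because the combined kernel is a nonnegative weighted sum of positive semidefinite kernels, it is itself a valid kernel, so the usual LS-SVR solver applies unchanged.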
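The abstract names a "CS reconstruction algorithm" without specifying which one. As a representative example of that family, a sketch of Orthogonal Matching Pursuit (OMP) is given below: applied to the kernel system K @ alpha ≈ y, a greedy solver like this returns a coefficient vector with only a few nonzero entries, i.e. only a few support vectors, which is the sparsification idea behind MKS-LSSVR. The function and parameter names are illustrative.

```python
import numpy as np

def omp(Phi, y, k, tol=1e-10):
    # Orthogonal Matching Pursuit: greedily build a k-sparse solution of
    # Phi @ x ~= y, re-fitting all selected atoms by least squares each step.
    residual = y.astype(float).copy()
    idx = []
    coef = np.zeros(0)
    for _ in range(k):
        if np.linalg.norm(residual) < tol:
            break                                   # already reconstructed
        j = int(np.argmax(np.abs(Phi.T @ residual)))  # most correlated atom
        idx.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x = np.zeros(Phi.shape[1])
    x[idx] = coef
    return x
```

In the MKS-LSSVR setting, Phi would be the (multi-)kernel matrix and x the sparse vector of support-vector coefficients; the sparsity level k then bounds the number of support vectors retained.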
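The multi-scale idea of contribution 2 (regress, then iteratively approximate the residual at a finer scale) can be sketched as a simple two-stage procedure: a wide-kernel LS-SVR captures the overall trend, and a narrow-kernel LS-SVR then regresses the first stage's residual. This is only one plausible reading of the abstract's iterative-approximation scheme; the thesis's exact algorithm may differ, and all names and parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * d2)

def lssvr_solve(K, y, C):
    # Bordered LS-SVR linear system -> bias b and coefficients alpha
    n = len(y)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)),  K + np.eye(n) / C]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def multiscale_lssvr(X, y, gammas, C=10.0):
    # Fit the widest scale first, then regress each stage's residual
    # with the next (narrower) kernel scale -- iterative approximation.
    stages, residual = [], y.copy()
    for g in gammas:
        K = rbf_kernel(X, X, g)
        b, alpha = lssvr_solve(K, residual, C)
        stages.append((g, b, alpha))
        residual = residual - (K @ alpha + b)
    return stages

def multiscale_predict(stages, X_train, X_test):
    # Sum the per-scale regressors
    pred = np.zeros(len(X_test))
    for g, b, alpha in stages:
        pred += rbf_kernel(X_test, X_train, g) @ alpha + b
    return pred
```

On a signal mixing a slow and a fast component, adding the fine-scale residual stage reduces the training error relative to the coarse-scale fit alone, which is the accuracy gain the abstract claims for the multi-scale scheme.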
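For contribution 3, the abstract does not say which wavelet kernel is used. A commonly cited admissible choice is the translation-invariant Morlet wavelet kernel, sketched below as an illustration; the thesis's actual kernel and scale parameter may differ.

```python
import numpy as np

def morlet_wavelet_kernel(X1, X2, a=1.0):
    # Morlet wavelet kernel with dilation parameter a:
    # K(x, z) = prod_i cos(1.75 * u_i) * exp(-u_i**2 / 2),  u = (x - z) / a
    U = (X1[:, None, :] - X2[None, :, :]) / a
    return np.prod(np.cos(1.75 * U) * np.exp(-0.5 * U**2), axis=2)
```

The resulting kernel matrix is symmetric with unit diagonal, and the oscillatory cosine factor makes off-diagonal entries decay and change sign, which is the "nearly orthogonal" structure the abstract credits with keeping the matrix invertible in the TWSVM/LSTSVM solution.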
Keywords/Search Tags:support vector machine, least squares, function regression, multi-kernel learning, compressed sensing