
Research On Sparse Kernel Regression Modelling Methods

Posted on: 2015-01-06  Degree: Master  Type: Thesis
Country: China  Candidate: L. M. Wang  Full Text: PDF
GTID: 2268330428968451  Subject: Computer application technology
Abstract/Summary:
Kernel methods, which use kernel functions in machine learning, have two significant features. First, they build a bridge between linear and nonlinear problems, so that nonlinear problems can be solved with techniques developed for linear ones. Second, kernel functions avoid the curse of dimensionality without increasing computational complexity. Sparsity is an important criterion for judging the quality of a kernel model, so how to construct a sparse kernel model is a hot topic in the machine learning field. There are two main strategies for building a kernel model: convex optimization and the greedy method. Convex optimization yields a unique global optimum and cannot become trapped in a local optimum; the support vector machine (SVM) is the representative convex-optimization model. The greedy method finds a solution very quickly, but that solution is not always the global optimum; it may be only an approximation of it. Kernel matching pursuit, projection pursuit, and orthogonal least squares are three models that use the greedy method. SVMs show excellent performance on small-sample, nonlinear, and high-dimensional pattern recognition problems. Compared with SVMs, kernel matching pursuit offers equivalent performance with better sparsity and lower computational complexity. The orthogonal least squares model is simple and efficient: it is a nonlinear model with linear weights, and it has good generalization performance and sparsity.
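The greedy orthogonal least squares idea described above can be illustrated with a minimal sketch: at each step, select the kernel column (centred on one training sample) that most reduces the squared residual, then refit the linear weights. All function names, the RBF kernel choice, and the toy data here are illustrative assumptions, not the thesis's actual implementation.

```python
# Minimal sketch (assumed, not the thesis code) of greedy forward
# selection for sparse kernel regression in the OLS spirit.
import numpy as np

def rbf_kernel(X, centers, width=1.0):
    # Gaussian kernel matrix K[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def ols_select(K, y, n_terms):
    # Greedily pick kernel columns; refit least-squares weights each step.
    selected, residual = [], y.copy()
    w = np.zeros(0)
    for _ in range(n_terms):
        # Score each column by its squared correlation with the residual.
        scores = (K.T @ residual) ** 2 / (K ** 2).sum(0)
        scores[selected] = -np.inf          # never pick a column twice
        selected.append(int(np.argmax(scores)))
        w, *_ = np.linalg.lstsq(K[:, selected], y, rcond=None)
        residual = y - K[:, selected] @ w
    return selected, w

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
K = rbf_kernel(X, X, width=1.0)
idx, w = ols_select(K, y, n_terms=8)        # sparse model: 8 of 80 centres
mse = np.mean((y - K[:, idx] @ w) ** 2)
```

The resulting model uses only 8 of the 80 candidate centres, which is the sparsity property the abstract emphasizes; the selection is greedy, so the chosen subset need not be globally optimal.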
Currently, kernel models are usually constructed with a single type of kernel function, which is inadequate for data with heterogeneous structure, so we use mixed kernels to address this problem. We construct a regression model using the orthogonal least squares method with mixed kernels, and we use repeated weighted boosting search to speed up the search for kernel parameters. Repeated weighted boosting search has been shown to be a simple and effective global optimization algorithm, so the model built from orthogonal least squares regression, tunable mixed kernels, and repeated weighted boosting search has good sparsity and generalization performance.

Because the orthogonal least squares model with tunable mixed kernels is based on the greedy method, the solution it obtains is probably not the global optimum, only an approximation of it. We therefore borrow the idea of the decision tree to improve the model: we construct a tree that stores several local optima in order to approach the global optimum, striking a balance between global and local optimal solutions. With this tree structure, the model achieves better sparsity and generalization performance.
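A tunable mixed kernel of the kind described above can be sketched as a convex combination of two base kernels. The thesis tunes such parameters with repeated weighted boosting search; as a simpler, clearly labeled stand-in, this sketch grid-searches only the mixing coefficient. The kernel choices (RBF plus polynomial), the ridge fit, and the toy data are all illustrative assumptions.

```python
# Hypothetical sketch of a tunable mixed kernel: lam * RBF + (1 - lam) * poly.
# A grid search over lam stands in for repeated weighted boosting search.
import numpy as np

def mixed_kernel(X, Z, lam, width=1.0, degree=2):
    # Convex combination of two PSD kernels is still a valid PSD kernel.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    rbf = np.exp(-d2 / (2 * width ** 2))
    poly = (1.0 + X @ Z.T) ** degree
    return lam * rbf + (1.0 - lam) * poly   # 0 <= lam <= 1

def ridge_fit_mse(K, y, reg=1e-3):
    # Kernel ridge regression; returns the training MSE of the fit.
    alpha = np.linalg.solve(K + reg * np.eye(len(y)), y)
    return float(np.mean((K @ alpha - y) ** 2))

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(60, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * X[:, 0] ** 2   # mixed smooth/polynomial trend
best_mse, best_lam = min(
    (ridge_fit_mse(mixed_kernel(X, X, lam), y), lam)
    for lam in np.linspace(0.0, 1.0, 11)
)
```

In the thesis's setting, the search space also includes per-kernel parameters such as widths, which is why a global optimizer like repeated weighted boosting search is used instead of an exhaustive grid.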
Keywords/Search Tags: machine learning, kernels, orthogonal least squares, repeated weighted boosting search, decision tree