
Multi-output Regression Model With Tunable Kernel Based On Orthogonal Least Squares

Posted on: 2017-05-10    Degree: Master    Type: Thesis
Country: China    Candidate: C Z Yu    Full Text: PDF
GTID: 2308330488485662    Subject: Computer technology
Abstract/Summary:
Kernel machines are a hot topic in the field of machine learning. The idea of the kernel method is that the data are implicitly mapped to a high-dimensional feature space through kernel substitution, and the kernel function replaces the inner product of the nonlinear mapping, which turns a nonlinear problem into a linear one in feature space. Both the sparsity and the accuracy of a kernel model are basic criteria for evaluating the method, so much research effort is devoted to constructing sparse kernel models. In recent years, more and more kernel-based models have been applied to data modelling, such as the support vector machine (SVM) and kernel linear regression. However, these methods are based on a fixed-scale model: a single, unchanging scale is applied to every term in the kernel expansion. When the input signal contains both flat and non-flat parts, a fixed-scale kernel model needs more terms to fit the data, which degrades the generalization ability of the model.

To improve the sparsity of the constructed model, this thesis proposes a scale-tunable kernel model in which each term uses its own tunable scale, making the model more flexible than the traditional fixed-scale model. Because multi-output problems have broader application fields and practical backgrounds, the new model is applied to the multi-output setting. The model is constructed with the orthogonal least squares (OLS) algorithm: when each regressor is selected, a group search optimization algorithm tunes its kernel parameters to minimize the residual objective function. OLS is a fast method for constructing sparse networks and is mainly used in nonlinear system modelling; because it is simple and effective, and the resulting models have low complexity and good generalization performance, it has become a research hotspot in machine learning and intelligent control. Group search optimization (GSO) is applied to optimize the parameters of each regressor. GSO is a population-based random search algorithm inspired by animal behaviour, especially animal searching behaviour. Within this framework, concepts from animal searching behaviour, e.g., scanning mechanisms, are employed metaphorically to design search strategies for continuous optimization problems. GSO has a simple structure and fast convergence speed, and is well suited to high-dimensional optimization problems such as neural network construction.

Experiments were performed on both artificial and real-world datasets. The results show that the newly proposed algorithm produces sparser models than traditional kernel machines such as multi-output support vector machines and relevance vector machines.
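To make the construction procedure concrete, the following is a minimal sketch of forward OLS selection of Gaussian kernel regressors with a tunable width per term, for multi-output targets. It is an illustration under stated assumptions, not the thesis's actual implementation: the function and variable names are hypothetical, the candidate centers/widths are explored by plain random sampling over a small grid as a simplified stand-in for GSO, and only the structure (per-term scale tuning inside an OLS loop) follows the abstract.

```python
import numpy as np

def gaussian_term(X, center, width):
    """One kernel regressor phi(x) = exp(-||x - c||^2 / (2 * width^2))."""
    d2 = np.sum((X - center) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def ols_tunable_kernel(X, Y, n_terms=10, widths=(0.1, 0.3, 1.0, 3.0), seed=0):
    """Greedy forward OLS: at each step pick the (center, width) pair whose
    orthogonalized regressor removes the most residual energy, summed over
    all outputs (multi-output case). Random sampling of candidates stands in
    for the GSO step described in the thesis."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    residual = Y.astype(float).copy()   # shape (N, n_outputs)
    selected = []                       # list of chosen (center, width)
    basis = []                          # orthogonalized columns kept so far

    for _ in range(n_terms):
        best = None
        # candidate centers: a random subset of training points
        for idx in rng.choice(N, size=min(50, N), replace=False):
            for w in widths:
                phi = gaussian_term(X, X[idx], w)
                # Gram-Schmidt against the already-selected orthogonal columns
                q = phi.copy()
                for b in basis:
                    q -= (b @ phi) / (b @ b) * b
                qq = q @ q
                if qq < 1e-12:          # nearly dependent on chosen terms
                    continue
                # error reduction summed over all outputs
                err = np.sum((q @ residual) ** 2) / qq
                if best is None or err > best[0]:
                    best = (err, X[idx].copy(), w, q)
        if best is None:
            break
        _, c, w, q = best
        g = (q @ residual) / (q @ q)    # per-output weight on this column
        residual = residual - np.outer(q, g)
        basis.append(q)
        selected.append((c, w))

    # recover weights for the original (non-orthogonal) regressors
    Phi = np.column_stack([gaussian_term(X, c, w) for c, w in selected])
    W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return selected, W

# toy usage: 1-D input with flat and non-flat regions, two outputs
X = np.linspace(-3, 3, 200).reshape(-1, 1)
Y = np.column_stack([np.sin(3 * X[:, 0]) * (X[:, 0] > 0), X[:, 0] ** 2 / 9])
terms, W = ols_tunable_kernel(X, Y, n_terms=8)
```

Allowing each selected term its own width is what lets the sketch cover both the flat and the oscillatory regions with few regressors; with a single fixed width, the same residual reduction would typically require more terms, which is the sparsity argument made in the abstract.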
Keywords/Search Tags:machine learning, kernel function, sparse, multi-output, orthogonal least squares, group search optimization