
Research On Nonlinear Regression Model Based On Logistic Product Basis Network With Orthogonal Least Squares Algorithm

Posted on: 2017-07-10
Degree: Master
Type: Thesis
Country: China
Candidate: R Wang
Full Text: PDF
GTID: 2348330488985662
Subject: Computer technology
Abstract/Summary:
In recent decades, machine learning has developed rapidly and has been widely applied in many fields, such as data mining, artificial intelligence, and health care. As one of the most important subjects in machine learning, the regression problem has drawn great attention from both academia and industry. Unlike linear regression, which is simple and easy to handle, nonlinear regression presents an extremely challenging research topic, because it must model the complex nonlinear relationship between the dependent variable and multiple feature attributes. Nonlinear regression is therefore considered one of the most important research topics in machine learning, and improving model performance is a central concern within it. In the existing literature, the main approaches to achieving optimal model performance include the selection of the model structure, the parameter estimation method, and the pre-processing method.

The logistic product basis network (LPBN) was recently proposed as an excellent nonlinear regression tool. The LPBN has a deeper structure than the standard single-hidden-layer logistic perceptron: a linear combination of products of logistic functions is optimized by the gradient descent (GD) algorithm. However, GD-type algorithms are computationally expensive (slow) even when training standard neural networks with logistic neurons, and the situation is worse for the LPBN because it contains many more parameters to be optimized than the logistic perceptron does.

This thesis studies the construction of nonlinear models in two main aspects: the model structure and the parameter estimation method. Specifically, the LPBN is modified with orthogonal least squares (OLS). OLS is employed to decompose the LPBN model into a number of sub-models, and the model is then built up by a greedy algorithm, so that at each iteration only a small sub-model needs to be optimized.
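The greedy OLS decomposition described above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the product-of-logistics basis form and all function names, shapes, and parameters here are assumptions. Each candidate basis column is orthogonalized against the columns already selected, and the column with the largest error reduction ratio is picked at each step:

```python
import numpy as np


def logistic_product_basis(X, A, B):
    """Assumed LPBN basis form: phi_k(x) = prod_j sigmoid(a_jk * x_j + b_jk).

    X: (n, d) inputs; A, B: (d, m) unit parameters -> Phi: (n, m).
    """
    Z = X[:, :, None] * A[None, :, :] + B[None, :, :]   # (n, d, m)
    return np.prod(1.0 / (1.0 + np.exp(-Z)), axis=1)    # product over inputs


def ols_forward_selection(Phi, y, n_select):
    """Greedy OLS forward selection over candidate basis columns.

    At each step, Gram-Schmidt-orthogonalize every unselected column
    against the already-selected ones and pick the column whose
    orthogonal component explains the largest share of the output
    energy (the error reduction ratio)."""
    n, m = Phi.shape
    selected, Q = [], []
    yy = y @ y
    for _ in range(n_select):
        best_err, best_j, best_q = -1.0, None, None
        for j in range(m):
            if j in selected:
                continue
            q = Phi[:, j].astype(float)
            for qk in Q:                      # orthogonalize against chosen columns
                q = q - (qk @ q) / (qk @ qk) * qk
            denom = q @ q
            if denom < 1e-12:                 # numerically dependent column
                continue
            err = (q @ y) ** 2 / (denom * yy)  # error reduction ratio
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        selected.append(best_j)
        Q.append(best_q)
    return selected
```

Because only the newly selected sub-model's parameters need tuning at each iteration, the full network is never optimized all at once, which is the source of the speed-up the thesis targets.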
Repeated weighted boosting search (RWBS) is applied to tune the parameters at each individual regression stage. RWBS is simple, effective, and easy to program, and it is considered a global search algorithm, so the new algorithm achieves a fast convergence rate. Experiments were performed on both artificial and real-life datasets, the latter publicly available from the UCI machine learning repository. Compared with the support vector machine, the relevance vector machine, and the standard LPBN, the experimental results show that the new algorithm improves the convergence speed on high-dimensional regression problems and yields a sparser model.
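The RWBS idea can be sketched roughly as below. Parameter names, default values, and the reflection step follow the general RWBS literature rather than this thesis, so treat this as an illustrative sketch: each generation keeps the best point found so far, re-samples the rest of the population, then repeatedly forms a boosting-weighted combination of the population and its mirror image about the current best point, replacing the worst member whenever an improvement appears:

```python
import numpy as np


def rwbs_minimize(cost, bounds, pop_size=8, n_generations=20, n_weighting=10, rng=None):
    """Minimal sketch of repeated weighted boosting search (RWBS)."""
    rng = np.random.default_rng(rng)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    best = rng.uniform(lo, hi)
    best_cost = cost(best)
    for _ in range(n_generations):
        pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
        pop[0] = best                               # elitism: carry over best point
        costs = np.array([cost(p) for p in pop])
        weights = np.full(pop_size, 1.0 / pop_size)
        for _ in range(n_weighting):
            order = np.argsort(costs)
            # boosting-style reweighting: better points get larger weight
            ranks = np.empty(pop_size)
            ranks[order] = np.arange(pop_size, 0, -1)
            weights = weights * ranks
            weights /= weights.sum()
            centre = weights @ pop                  # weighted combination of population
            mirror = 2.0 * pop[order[0]] - centre   # reflection of centre about best
            for cand in (np.clip(centre, lo, hi), np.clip(mirror, lo, hi)):
                c = cost(cand)
                worst = np.argmax(costs)
                if c < costs[worst]:                # replace worst member on improvement
                    pop[worst], costs[worst] = cand, c
        i = np.argmin(costs)
        if costs[i] < best_cost:
            best, best_cost = pop[i].copy(), costs[i]
    return best, best_cost
```

Because fresh uniform samples are drawn every generation while the elite point is retained, the search combines global exploration with local refinement, which is why RWBS is regarded as a global search method and needs no gradient information.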
Keywords/Search Tags:nonlinear regression, orthogonal least squares, repeated weighted boosting search algorithm, neural network, K-means clustering