
Study of Least Squares Support Vector Regression

Posted on: 2019-08-25
Degree: Master
Type: Thesis
Country: China
Candidate: Y T Tong
Full Text: PDF
GTID: 2428330548487461
Subject: Statistics

Abstract/Summary:
In this thesis, we study least squares support vector regression (LS-SVR), which inherits the advantages of the support vector machine, such as good performance on small samples, the ability to handle high-dimensional data, and strong generalization. However, because it is built on the least squares loss function, LS-SVR lacks sparsity and robustness. In addition, the parameter selection of a machine learning algorithm directly affects the predictive ability of the model. In this thesis we improve LS-SVR in three respects, namely sparsity, robustness, and parameter selection, and propose new algorithms accordingly. Our main contributions are as follows.

(1) Most methods for improving the sparsity of LS-SVR work in the original space by selecting support vectors or considering feature vectors. In this thesis, we propose a density-based iterative LS-SVR. First, we map the samples into a high-dimensional feature space, so that information hidden in the original space becomes apparent in the feature space. We then compute the distance between each cluster and the hyperplane to identify the sample points located on the boundary of the regression hyperplane. The points that do not lie on this boundary are clustered according to their density, and the algorithm is iterated until the training set contains the main information needed for the regression. By compressing the training set in this way, the method improves the sparsity of LS-SVR while preserving its predictive ability. This is the first novelty of this thesis.

(2) The weighted LS-SVR proposed by Suykens performs poorly in the presence of outliers, and its weights cannot adapt to the distribution of the fitting errors and the outliers. We introduce the IGG weight function, which is more robust to outliers; the weights can also be adjusted by the density-based iterative LS-SVR (a sketch of this robust re-weighting idea is given after the abstract). This is the second novelty of this thesis.

(3) The particle swarm optimization used to search for the optimal parameters of the model is prone to becoming trapped in local optima. In this thesis, we let the inertia weight factor change with the fitness value, so that some particles search with small steps around the current local optimum while the others search with larger steps. This helps the particles escape from local optima (a sketch of this adaptive inertia weight is also given after the abstract). This is the third novelty of this thesis.

(4) The improved LS-SVR algorithm is evaluated on five UCI benchmark datasets. The numerical results show that the proposed method yields a clear improvement in prediction accuracy and training time. In addition, the new method is applied to an air quality prediction problem; the results show that our algorithm outperforms SVM, neural networks, and decision trees in terms of prediction accuracy and training time.
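To make the discussion concrete, the following Python sketch shows the standard LS-SVR dual solution with an RBF kernel, followed by one robust re-weighting pass in the spirit of Suykens' weighted LS-SVR. The three-segment weight function and its thresholds k0 and k1 are illustrative assumptions standing in for the IGG-type weights used in the thesis, not the thesis's exact scheme, and the toy data at the end exists only to show the calling pattern.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0, v=None):
    """Solve the LS-SVR dual linear system.

    Unweighted LS-SVR uses v_i = 1; a weighted pass supplies per-sample
    weights v_i that rescale the regularization of each error term.
    """
    n = len(y)
    if v is None:
        v = np.ones(n)
    K = rbf_kernel(X, X, sigma)
    # KKT system: [[0, 1^T], [1, K + diag(1/(gamma*v))]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvr_predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def robust_weights(errors, k0=1.5, k1=3.0):
    """Illustrative three-segment weight function (IGG-style assumption):
    full weight for small standardized errors, down-weighting for moderate
    ones, and near-zero weight for gross outliers."""
    s = 1.4826 * np.median(np.abs(errors - np.median(errors)))  # robust scale
    u = np.abs(errors) / max(s, 1e-12)
    v = np.ones_like(u)
    mid = (u > k0) & (u <= k1)
    v[mid] = k0 / u[mid]
    v[u > k1] = 1e-4                # effectively discard gross outliers
    return v

# One robust re-weighting pass on toy data (hypothetical example).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
y[::15] += 2.0                            # inject a few outliers
alpha, b = lssvr_fit(X, y)                # plain LS-SVR
e = y - lssvr_predict(X, alpha, b, X)     # fitting errors
alpha, b = lssvr_fit(X, y, v=robust_weights(e))   # weighted re-fit
```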
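The parameter-selection idea in contribution (3) can be sketched in the same spirit. Below is a minimal particle swarm optimizer in which each particle's inertia weight depends on its fitness rank, so that better particles take small steps around the current optimum while worse particles take larger, more exploratory steps. The linear rank-based inertia schedule is an illustrative assumption rather than the exact rule from the thesis, and `cv_error` in the usage comment is a hypothetical objective such as a cross-validation error of LS-SVR over (gamma, sigma).

```python
import numpy as np

def pso_adaptive_inertia(objective, bounds, n_particles=20, n_iter=100,
                         w_min=0.4, w_max=0.9, c1=2.0, c2=2.0, seed=0):
    """PSO whose inertia weight varies with each particle's fitness:
    good particles get a small inertia (fine local search), poor particles
    get a large inertia (wider exploration). The rank-based schedule is an
    illustrative assumption."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds)[:, 0], np.asarray(bounds)[:, 1]
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()

    for _ in range(n_iter):
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
        # Fitness-dependent inertia: rank 0 (best) -> w_min, worst -> w_max.
        ranks = np.argsort(np.argsort(f)) / max(n_particles - 1, 1)
        w = (w_min + (w_max - w_min) * ranks)[:, None]
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
    return g, pbest_f.min()

# Hypothetical usage: tune (gamma, sigma) of LS-SVR by minimizing a
# validation error returned by `cv_error` (not defined here).
# best_params, best_err = pso_adaptive_inertia(
#     cv_error, bounds=[(0.1, 1000.0), (0.01, 10.0)])
```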
Keywords/Search Tags: Statistical Learning Theory, Support Vector Machine, Least Squares Support Vector Regression, Sparsity, Robustness