
Least Squares Kernel Ensemble Learning

Posted on: 2020-12-15    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Dickson Keddy Wornyo    Full Text: PDF
GTID: 1368330623961212    Subject: Computer application technology
Abstract/Summary:
Machine learning has advanced considerably in recent years, and least squares methods have received particular attention owing to their simplicity of formulation and implementation. Although least squares models perform well in classification and regression, they are sensitive to parameter settings, a weakness that exposes the limitations of the single-model approach. An effective way to address this problem is to introduce an ensemble model. This dissertation first outlines the background of the study and the application areas of least squares methods, then briefly reviews the current state of research, and finally introduces the preprocessing aspects of the least squares method. Throughout this research, significant progress is made in applying least squares to kernel ensemble learning by putting forward co-regularized kernel ensemble regression, a coupled least squares support vector ensemble machine, and sample-induced factorization kernel ensemble regression. The main contributions are as follows:

1) A co-regularized kernel ensemble regression scheme is put forward. In this scheme, multiple kernel regressors are absorbed into a unified ensemble regression framework and co-regularized simultaneously by minimizing the total loss of the ensemble in a Reproducing Kernel Hilbert Space. In this way, a kernel regressor that fits the data with higher precision automatically obtains a larger weight, which leads to better overall ensemble performance (a simplified code sketch of this weighting idea follows the abstract). Compared with several single and ensemble regression methods such as Gradient Boosting, Tree Regression, Support Vector Regression, Ridge Regression, and Random Forest, the proposed method achieves the best performance on regression and classification tasks over several UCI datasets.

2) A novel coupled least squares support vector ensemble machine (C-LSSVEM) is proposed. The coupled ensemble improves robustness and yields better classification performance than the single-model approach. C-LSSVEM chooses appropriate kernel types and parameters through a coupling strategy in which a set of classifiers is trained simultaneously, minimizing the total loss of the ensemble in kernel space; in this way, the base kernel regressors are co-optimized and weighted to form an ensemble regressor. Extensive experiments on several datasets, including artificial datasets, UCI regression and classification datasets, handwritten digits datasets, and the NWPU-RESISC45 dataset, indicate that C-LSSVEM achieves the lowest regression loss and the highest classification accuracy compared with state-of-the-art regression and classification methods.

3) A novel sample-induced factorization kernel ensemble regression (SIFKER) is further proposed to address the difficulty of differentiating between relevant data points and outliers. In this sample-induced framework, a matrix D is introduced that accounts for both the data distribution and penalty weights, recovering outlier-free global structures from missing and noisy data points. To evaluate the quality of each base regressor and select the well-behaved ones, sample-induced factorization is introduced into the loss function to decrease the weights of badly behaved regressors. Experimental results on various UCI and computer vision datasets show that the proposed method achieves lower regression error and higher classification accuracy than other state-of-the-art methods.
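The abstract does not spell out the optimization details, so the following is only a minimal illustrative sketch of the loss-based weighting idea behind contribution 1): an ensemble of kernel ridge regressors whose convex weights shrink as a base learner's loss grows. The class name, kernel choices, and inverse-loss weighting rule are assumptions made for demonstration, not the dissertation's exact co-regularized formulation.

# Illustrative sketch only: a simplified kernel ensemble regressor in the
# spirit of the co-regularized scheme described above. The weighting rule
# (inverse validation loss) is an assumption standing in for the joint
# co-regularized optimization in the dissertation.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

class KernelEnsembleRegressor:
    """Weighted ensemble of kernel ridge regressors.

    Each base regressor uses a different kernel; convex weights are set
    inversely proportional to each regressor's validation loss, so a base
    learner that fits the data more precisely receives a larger weight.
    """

    def __init__(self, kernels=(("rbf", 1.0), ("poly", 1.0), ("laplacian", 1.0))):
        self.models = [KernelRidge(kernel=k, alpha=a) for k, a in kernels]
        self.weights = None

    def fit(self, X, y):
        # Hold out part of the data to estimate each base learner's loss.
        X_tr, X_val, y_tr, y_val = train_test_split(
            X, y, test_size=0.25, random_state=0)
        losses = []
        for m in self.models:
            m.fit(X_tr, y_tr)
            losses.append(np.mean((m.predict(X_val) - y_val) ** 2))
        inv = 1.0 / (np.asarray(losses) + 1e-12)
        self.weights = inv / inv.sum()  # convex weights: lower loss -> bigger weight
        return self

    def predict(self, X):
        preds = np.stack([m.predict(X) for m in self.models])
        return self.weights @ preds

# Usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
model = KernelEnsembleRegressor().fit(X, y)
print(model.weights)  # learned ensemble weights

In the dissertation's actual scheme, the base regressors and their weights are co-optimized jointly in a Reproducing Kernel Hilbert Space rather than weighted after the fact; the sketch is meant only to convey why a more precise base learner ends up with a larger weight in the ensemble.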
Keywords/Search Tags:Ensemble Regression, Multi-Kernel Learning, Kernel Regression, Least Squares Support Vector Machines, Classification