
Improvement Of The AdaBoost Algorithm Based On Self-Adaptive Weak-Classifier Weights After Clustering

Posted on: 2017-05-31    Degree: Master    Type: Thesis
Country: China    Candidate: K F Yao    Full Text: PDF
GTID: 2428330590491683    Subject: Applied Mathematics
Abstract/Summary:
In this paper, we propose an ensemble method called DC-AdaBoost, and by way of combination we also improve the SVM algorithm (the ramp-loss formulation under an l1+l2 constraint). At the early stage of ensemble learning, a clustering step is applied to the training data; in the later AdaBoost iterations, the weights are optimized as a trade-off based on the clustering, and each weak classifier's voting weight is adjusted according to which cluster a test point belongs to. This technique achieves locally optimal fusion of the weak classifiers' votes on the data, rather than a single global vote, and it offers a solution to the noise-sensitivity problem. In experiments comparing the final algorithm with logistic regression (LR), AdaBoost, and the improved SVM, its performance and efficiency are significantly better than those of AdaBoost. The method can also be applied to the RF and GBDT algorithms, and can be extended to other algorithms built on the bagging or bootstrap ensemble idea.
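The abstract does not give the exact weight-optimization rule, so the following is only a minimal sketch of the cluster-local voting idea under stated assumptions: standard discrete AdaBoost with decision stumps is trained first, the training data are clustered with K-means, and each weak classifier's global vote weight is then rescaled per cluster by its accuracy on that cluster's training points (a hypothetical trade-off rule standing in for the thesis's own optimization). All function names here are illustrative, not from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(X, y, w):
    """Weighted decision stump: best (feature, threshold, polarity) under sample weights w."""
    best_err, best = 1.0, (0, 0.0, 1)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for s in (1, -1):
                pred = np.where(X[:, f] <= t, s, -s)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (f, t, s)
    return best

def stump_predict(stump, X):
    f, t, s = stump
    return np.where(X[:, f] <= t, s, -s)

def train_adaboost(X, y, n_rounds=10):
    """Standard discrete AdaBoost (labels in {-1, +1})."""
    n = len(X)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(n_rounds):
        st = fit_stump(X, y, w)
        pred = stump_predict(st, X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        a = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-a * y * pred)
        w /= w.sum()
        stumps.append(st)
        alphas.append(a)
    return stumps, np.array(alphas)

def kmeans_fit(X, k, iters=20):
    """Plain Lloyd's K-means; returns cluster centers."""
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        lab = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if (lab == c).any():
                centers[c] = X[lab == c].mean(axis=0)
    return centers

def kmeans_assign(X, centers):
    return np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)

def local_alphas(X, y, stumps, alphas, centers):
    """Per-cluster vote weights: global alpha scaled by local training accuracy
    (assumed rule; the thesis derives its own weight optimization)."""
    lab = kmeans_assign(X, centers)
    L = np.tile(alphas, (len(centers), 1))
    for c in range(len(centers)):
        m = lab == c
        if not m.any():
            continue
        for j, st in enumerate(stumps):
            L[c, j] = alphas[j] * np.mean(stump_predict(st, X[m]) == y[m])
    return L

def predict_local(X, stumps, L, centers):
    """Each point votes with the alpha row of the cluster it falls into."""
    lab = kmeans_assign(X, centers)
    V = np.column_stack([stump_predict(st, X) for st in stumps])
    return np.sign(np.einsum('nt,nt->n', V, L[lab]))

# Toy data: two Gaussian blobs, labels in {-1, +1}.
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.r_[np.full(100, -1), np.full(100, 1)]
centers = kmeans_fit(X, k=3)
stumps, alphas = train_adaboost(X, y, n_rounds=10)
L = local_alphas(X, y, stumps, alphas, centers)
acc = np.mean(predict_local(X, stumps, L, centers) == y)
```

Because the per-cluster rescaling only shrinks unreliable votes locally (all weights stay non-negative), the sketch degenerates gracefully to plain AdaBoost voting when every weak classifier is equally accurate in every cluster; the same wrapper could in principle be placed around RF or GBDT base learners, as the abstract suggests.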
Keywords/Search Tags:K-means, AdaBoost, Dynamic weight adjustment