
The Improvement and Application of the AdaBoost Algorithm

Posted on: 2018-05-14 | Degree: Master | Type: Thesis
Country: China | Candidate: Z Z Zhang | Full Text: PDF
GTID: 2348330533457201 | Subject: Mathematics · Application Statistics
Abstract/Summary:
The AdaBoost algorithm is mainly used for classification and regression problems. Its core idea is to train base classifiers iteratively and combine them, through a weighted vote, into a strong classifier. Compared with many other machine learning algorithms, AdaBoost can effectively resist overfitting. In this paper, AdaBoost is used for classification. In the classical AdaBoost algorithm, the weight of each base classifier is computed from its error rate on the training subset and then fixed, so the final strong classifier is unique. When a test sample is not similar to the training samples, this strong classifier may generalize poorly, and the final classification results may fall short of the desired effect. This paper therefore proposes generating a different strong classifier for each test sample. The main idea is as follows: first, the training set is clustered with the K-means++ algorithm; second, the similarity between the test sample and each group is measured by Euclidean distance, while the weight of each group is computed from its error rate; finally, similarity and group weight are combined into a weighted vote that yields the final strong classifier for that sample. Two data sets are selected for empirical analysis, and the results show that the improved AdaBoost algorithm outperforms the classical AdaBoost algorithm.
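The dynamic scheme outlined above can be sketched in code. The thesis full text is not available here, so everything concrete below is an illustrative assumption: decision stumps as base classifiers, inverse Euclidean distance as the similarity measure, and an AdaBoost-style log-odds formula for the group weight. It is a minimal sketch of the idea, not the author's actual DY-AdaBoost implementation.

```python
import numpy as np

def fit_stump(X, y, w):
    # Exhaustively pick the (feature, threshold, polarity) decision stump
    # with minimum weighted error; labels y are in {-1, +1}.
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def adaboost(X, y, T=5):
    # Classical AdaBoost: reweight samples each round, weight each stump
    # by alpha = 0.5 * log((1 - err) / err).
    w = np.full(len(y), 1.0 / len(y))
    model = []
    for _ in range(T):
        j, thr, pol, err = fit_stump(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        model.append((j, thr, pol, alpha))
    return model

def predict(model, X):
    score = sum(a * np.where(p * (X[:, j] - t) >= 0, 1, -1)
                for j, t, p, a in model)
    return np.sign(score)

def kmeans_pp(X, k, rng, iters=20):
    # K-means++ seeding followed by standard Lloyd iterations.
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    C = np.array(centers)
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([X[lab == c].mean(0) if np.any(lab == c) else C[c]
                      for c in range(k)])
    return C, lab

def dy_predict(models, errs, centers, x):
    # Dynamic combination: each group's strong classifier votes with a weight
    # combining similarity (inverse Euclidean distance to the group centroid,
    # an assumed choice) and the group's training error rate.
    sim = 1.0 / (1e-10 + np.linalg.norm(centers - x, axis=1))
    wgt = sim * 0.5 * np.log((1 - errs) / np.maximum(errs, 1e-10))
    score = sum(w * predict(m, x[None])[0] for m, w in zip(models, wgt))
    return 1 if score >= 0 else -1

# Toy demonstration on two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
y = np.r_[-np.ones(20), np.ones(20)]
C, lab = kmeans_pp(X, 2, rng)
models, errs = [], []
for c in range(2):
    m = adaboost(X[lab == c], y[lab == c])
    models.append(m)
    errs.append(float(np.mean(predict(m, X[lab == c]) != y[lab == c])))
errs = np.array(errs)
```

A test point near either blob is then classified by `dy_predict`, with the nearby group's strong classifier dominating the vote; unlike classical AdaBoost, the effective combination changes from one test sample to the next.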
Keywords/Search Tags: AdaBoost algorithm, K-means++ algorithm, DY-AdaBoost algorithm, dynamic, cluster