
Average Distribution Integration Strategy: A New Method of Classifier Fusion

Posted on: 2017-04-14
Degree: Master
Type: Thesis
Country: China
Candidate: B Zhu
Full Text: PDF
GTID: 2278330482497642
Subject: Computer Science and Technology
Abstract/Summary:
One of the core goals of machine learning is to give machines the same capacity for autonomous learning that intelligent creatures possess. Today, machine learning is at the heart of research in artificial intelligence, and its applications cover nearly every branch of the field, including expert systems, automated reasoning, natural language processing, pattern recognition, and computer vision.

Within machine learning, ensemble learning has been a major research direction and has grown significantly over the last two decades. Ensemble learning trains many different classifiers and then fuses their predictions into a final classification result. Compared with a single classifier, an ensemble can in most cases significantly improve the generalization ability of the classification system and offers stronger robustness and stability. Classifier fusion is one of the key problems in ensemble learning, and a variety of fusion methods have been proposed.

This dissertation presents a relatively in-depth study of ensemble learning, introducing its concepts, structure, functions, and recent progress. On this basis, the study proposes the concept of an "equivalent distribution of ensembles": by adjusting the weights of the base classifiers, the ensemble classifier is expected to perform equivalently across different samples. This strategy enables samples that are correctly classified by only a minority of the base classifiers to still be predicted correctly by the ensemble. In addition, we propose the concept of an "equivalent coefficient", which is used to measure the weights of the multiple base classifiers in an ensemble. We carried out experiments on 12 UCI datasets, strictly following a 10-fold cross-validation scheme. The results show that the proposed equivalent-distribution ensemble algorithm outperforms simple majority voting, LP-Adaboost, and LP1. We therefore believe that the equivalent distribution of ensembles is an effective classifier fusion method.
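To make the general setting concrete, the following is a minimal sketch of weighted-vote classifier fusion evaluated with 10-fold cross-validation, using scikit-learn. It does not reproduce the thesis's equivalent-distribution weight-adjustment rule or "equivalent coefficient"; the base classifiers, the hand-set weights, and the iris dataset are placeholder assumptions chosen only to illustrate how per-classifier weights enter the fused prediction.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier

# Three heterogeneous base classifiers; any set of diverse learners works.
base_estimators = [
    ("tree", DecisionTreeClassifier(max_depth=3)),
    ("logreg", LogisticRegression(max_iter=1000)),
    ("nb", GaussianNB()),
]

# Hypothetical per-classifier weights standing in for learned fusion weights;
# the thesis's actual weight-adjustment strategy is not implemented here.
weights = [0.4, 0.35, 0.25]

ensemble = VotingClassifier(estimators=base_estimators,
                            voting="hard", weights=weights)

# Placeholder dataset; the thesis evaluates on 12 UCI datasets instead.
X, y = load_iris(return_X_y=True)

# 10-fold cross-validation, matching the evaluation protocol described above.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(ensemble, X, y, cv=cv)
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Setting all weights equal recovers simple majority voting, the first baseline the thesis compares against; the proposed method differs in how the weights are chosen so that the ensemble's performance is balanced across samples.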
Keywords/Search Tags: Machine learning, Artificial intelligence, Ensemble learning, Classifier fusion, Equivalent distribution