
Research On Adaboost Classification Algorithm

Posted on: 2019-07-30  Degree: Master  Type: Thesis
Country: China  Candidate: H Wu  Full Text: PDF
GTID: 2428330596460822  Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Pattern classification is a fundamental problem in artificial intelligence and is widely applied in fields such as defense, security, manufacturing, transportation, finance, environmental protection, and medical care. In recent years it has attracted increasing attention with the development of social production and livelihood services. Pattern classification maps data to a given category by constructing a classification model, and such problems are generally approached from two aspects: features and classification. Features are the basis for training a classifier and, together with the classification algorithm, determine a model's performance. Adaboost is a classification algorithm with built-in feature selection ability, whose advantages are high accuracy and fast speed. This thesis therefore focuses on feature selection and classifier design for pattern classification problems based on Adaboost.

Feature extraction. The main ideas and implementations of Haar features and Gabor features are studied, and the advantages and disadvantages of the two kinds of features are compared in terms of computational cost and other aspects. Considering practical application requirements, Haar features are used for the two-class recognition problem, while Gabor features are used for multi-class recognition problems.

Two-class recognition. The basic idea of the Adaboost algorithm is studied and its feature selection mechanism is analyzed. To deal with the imbalance between positive and negative samples in object detection, the use of a hard-cascade classifier for detection is studied. To overcome the limitations of the hard-cascade structure, the thesis focuses on the soft-cascade structure and proposes a new classifier training algorithm involving feature selection and threshold calibration: in feature selection, candidate-set scanning is applied to optimize the sampling process; in threshold calibration, direct backward pruning is applied to improve prediction speed. Head-shoulder and license-plate image sets are used for object detection experiments, and the performance of the hard-cascade and soft-cascade classifiers is compared and analyzed. The results show that the soft-cascade classifier performs better in both accuracy and speed.

Multi-class recognition. Based on the one-vs-all classification mode, the GAB.OVA algorithm is studied to show how two-class classifiers are combined into a multi-class classifier, and its prediction model is analyzed. To overcome the limitation that the number of features grows linearly with the number of categories, a weak classifier called the composite stump is introduced and a new multi-class algorithm, BCS, is proposed. The BCS algorithm shares features among classes, eliminating the linear relationship between the number of features and the number of categories and thereby reducing the complexity of the prediction model. The MNIST handwritten character dataset and a license-plate character dataset are used for multi-class recognition experiments; the results show that the BCS algorithm has lower computational cost with no decrease in accuracy.
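One reason Haar features suit fast two-class detection, as the abstract notes on computational cost, is the integral image: any rectangle sum costs four lookups regardless of size. The following is a minimal illustrative sketch of a two-rectangle Haar-like feature; the rectangle coordinates and test image are arbitrary examples, not features from the thesis.

```python
import numpy as np

def integral_image(img):
    """Cumulative sums over rows and columns."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] via four integral-image lookups."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

def haar_two_rect(img, r0, c0, h, w):
    """Left-minus-right two-rectangle Haar-like feature at (r0, c0), size h x 2w."""
    ii = integral_image(img)
    left = rect_sum(ii, r0, c0, r0 + h, c0 + w)
    right = rect_sum(ii, r0, c0 + w, r0 + h, c0 + 2 * w)
    return left - right
```

Because the integral image is computed once per window, evaluating thousands of candidate Haar features during Adaboost training stays cheap.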
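The feature-selection mechanism of Adaboost mentioned above can be sketched as follows: each boosting round picks the single feature (and threshold) whose decision stump has the lowest weighted error, then reweights the samples. This is a minimal discrete-Adaboost sketch for illustration, not the thesis's implementation; the exhaustive threshold scan is the naive variant of what candidate-set scanning optimizes.

```python
import numpy as np

def train_adaboost(X, y, n_rounds):
    """X: (n_samples, n_features); y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)              # sample weights
    ensemble = []                        # (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best = None                      # (error, feature, threshold, polarity, preds)
        for j in range(d):               # each feature is a candidate stump
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol, pred)
        err, j, thr, pol, pred = best
        err = max(err, 1e-10)            # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)   # upweight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(score)
```

Each round commits to exactly one feature, which is why the ensemble doubles as a feature selector: features never chosen by any stump are never evaluated at prediction time.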
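The soft-cascade idea discussed above can be illustrated with a small sketch: after every weak classifier, the running score is checked against a calibrated rejection threshold, so most negative windows exit early. The calibration below is a simplified rendering of direct backward pruning under the assumption that no positive sample accepted by the full classifier may be rejected early; the per-stage scores and data are illustrative.

```python
import numpy as np

def calibrate_thresholds(pos_scores):
    """Simplified backward pruning: set each stage's rejection threshold to
    the minimum partial sum attained by any positive sample that the full
    classifier accepts, so early exits never lose an accepted positive.
    pos_scores: (n_pos, n_stages) per-stage weak-classifier outputs."""
    partial = np.cumsum(pos_scores, axis=1)
    accepted = partial[:, -1] > 0        # positives kept by the final score
    return partial[accepted].min(axis=0)

def soft_cascade_predict(weak_scores, thresholds):
    """weak_scores: per-stage outputs for one sample. Returns +1 or -1."""
    total = 0.0
    for s, r in zip(weak_scores, thresholds):
        total += s
        if total < r:                    # early rejection: stop evaluating
            return -1
    return 1 if total > 0 else -1
```

Unlike a hard cascade, where each stage is a separately trained classifier with its own threshold, the soft cascade keeps one monolithic score and prunes it, which is what enables both the speed and accuracy gains reported in the experiments.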
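The one-vs-all mode underlying GAB.OVA can be sketched generically: one real-valued binary classifier is trained per class against the rest, and a sample is assigned to the class with the highest score. The stand-in binary learner below is a trivial centroid scorer used only so the sketch runs; in the thesis the binary learners are boosted classifiers, and the names here are illustrative.

```python
import numpy as np

def train_ova(X, y, classes, fit_binary):
    """fit_binary(X, y_pm) -> scoring function f(X) -> real-valued scores."""
    return {c: fit_binary(X, np.where(y == c, 1, -1)) for c in classes}

def predict_ova(models, X):
    """Assign each sample to the class whose binary scorer is highest."""
    classes = list(models)
    scores = np.column_stack([models[c](X) for c in classes])
    return np.array(classes)[np.argmax(scores, axis=1)]

def centroid_learner(X, y_pm):
    """Stand-in binary learner: score = negative distance to the
    positive-class centroid (replaces a boosted classifier here)."""
    mu = X[y_pm == 1].mean(axis=0)
    return lambda Z: -np.linalg.norm(Z - mu, axis=1)
```

In this plain form, each class brings its own classifier and therefore its own features, which is exactly the linear growth in feature count that BCS removes by sharing features across classes via composite stumps.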
Keywords/Search Tags: pattern classification, Adaboost, cascade classifier, feature selection, feature sharing