
Robust ensemble classifiers and their applications to landmine detection

Posted on: 2005-12-15
Degree: Ph.D.
Type: Thesis
University: University of Florida
Candidate: Sun, Yijun
Full Text: PDF
GTID: 2458390008483854
Subject: Engineering
Abstract/Summary:
AdaBoost is one of the most important recent developments in classification methodology. It works by repeatedly applying a base learning algorithm to re-sampled versions of the training data to produce a collection of hypothesis functions, which are then combined via a weighted linear vote to form the final decision. Under mild assumptions, AdaBoost can drive the training error arbitrarily low. By pursuing a large norm-1 margin, it can also significantly improve generalization performance in many cases. However, recent studies have shown that AdaBoost performs poorly on noisy data. In this work we present several new regularized boosting algorithms that mitigate AdaBoost's overfitting problem. Our regularized algorithms are directly motivated by the connection between AdaBoost and linear programming. They implement an intuitive idea: by introducing a smooth convex penalty function into the objective of the underlying minimax problem, they control the skewness of the sample distribution during learning and thereby prevent outlier samples from spoiling the decision boundary. Large-scale experiments on the UCI (University of California, Irvine), DELVE (Data for Evaluating Learning in Valid Experiments), STATLOG, and USPS (US Postal Service) datasets are conducted. On the UCI, DELVE, and STATLOG datasets, our regularized boosting algorithms achieve performance at least as good as, and often much better than, other regularized AdaBoost algorithms. On the USPS datasets, our algorithms prove very robust to class mislabeling and feature noise. We also extend our analysis to multiclass problems. In particular, two multiclass AdaBoost algorithms, AdaBoost.MO and AdaBoost.ECC, are investigated; we prove that both belong to the family of stagewise functional gradient descent algorithms.
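The boosting loop described above (reweight the training data each round, collect hypotheses, combine them by a weighted linear vote) can be sketched as follows. This is a minimal illustration only, not the thesis's algorithm: it uses decision stumps as a stand-in base learner, and the function names and the choice of stumps are assumptions for the example.

```python
import numpy as np

def train_stump(X, y, w):
    """Weighted base learner: pick the (feature, threshold, polarity)
    decision stump with the lowest weighted training error."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, s, err)
    return best

def adaboost(X, y, T=20):
    """AdaBoost: maintain a distribution over examples, up-weighting
    those the current hypothesis misclassifies."""
    n = len(y)
    w = np.full(n, 1.0 / n)       # distribution over training examples
    ensemble = []                 # list of (alpha, stump) pairs
    for _ in range(T):
        j, t, s, err = train_stump(X, y, w)
        err = max(err, 1e-10)     # guard against log(0) on a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)   # vote weight of this hypothesis
        pred = np.where(X[:, j] <= t, s, -s)
        w *= np.exp(-alpha * y * pred)          # reweight: errors grow, hits shrink
        w /= w.sum()
        ensemble.append((alpha, (j, t, s)))
    return ensemble

def predict(ensemble, X):
    """Final decision: weighted linear vote over the collected hypotheses."""
    score = np.zeros(len(X))
    for alpha, (j, t, s) in ensemble:
        score += alpha * np.where(X[:, j] <= t, s, -s)
    return np.sign(score)
```

Note that the example weights `w` are exactly the distribution whose skewness the regularized variants above seek to control: a few noisy examples can otherwise absorb most of the weight mass after several rounds.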
Based on different margin definitions, two new regularized multiclass AdaBoost algorithms are also proposed. We then consider landmine detection via forward-looking ground-penetrating radar (FLGPR) using time-frequency analysis and AdaBoost. The task is to detect the presence of landmines in radar images, which we formulate as an object recognition problem. The two main challenges are: (1) extracting the intricate structure of target signals from radar imagery, and (2) adapting the classifier to its surrounding environment through learning. (Abstract shortened by UMI.)