
Automatic feature learning and parameter estimation for Hidden Markov models using MCE and Gibbs sampling

Posted on: 2010-07-18
Degree: Ph.D.
Type: Dissertation
University: University of Florida
Candidate: Zhang, Xuping
Full Text: PDF
GTID: 1448390002479255
Subject: Engineering
Abstract/Summary:
Hidden Markov models (HMMs) are useful tools for landmine detection using Ground Penetrating Radar (GPR), as well as for many other applications. The performance of HMMs and other feature-based methods depends not only on the design of the classifier but also on the features. Few studies have investigated the classifier and the feature set together as a whole system. Features that accurately and succinctly represent the discriminating information in an image or signal are very important to any classifier. In addition, when the system that generated the original images must be adapted to different environments, the features usually have to be modified as well. This modification process can be laborious and may require a great deal of application-specific domain knowledge. Hence, it is worthwhile to investigate methods of automatic feature learning for the purpose of automating algorithm development. Even if the discrimination capability is unchanged, feature learning still has value in terms of the time saved.

In this dissertation, two new methods are explored to simultaneously learn parameters for feature extraction and parameters for image-based classifiers. The notion of an image is general here; for example, it may be a sequence of time- or frequency-domain features. We have developed a generalized, parameterized model of feature extraction based on morphological operations. More specifically, the model uses hit-and-miss masks to extract the shapes of interest in the images. In one method, we use the minimum classification error (MCE) criterion with the generalized probabilistic descent algorithm to learn the parameters. Because the MCE method is based on gradient descent, it cannot guarantee a globally optimal solution and is very sensitive to initialization. We therefore propose a new learning method based on Gibbs sampling, which samples each parameter from its individual conditional probability distribution instead of maximizing the probability directly. This new method is more robust to initialization and generally finds a better solution.

We also developed a new Gibbs-sampling-based learning method for continuous hidden Markov models with multivariate Gaussian mixture emissions, since these are among the most commonly used HMM variants in practice. The proposed method is empirically shown to be more robust than comparable expectation-maximization algorithms.

We performed experiments using both synthetic and real data. The results show that both methods work better than the standard HMM methods used in landmine detection applications. Experiments with handwritten digits are also presented. The results show that the HMM framework with the automatic feature learning algorithm again performed better than the same framework with manually designed features.
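To make the Gibbs-sampling idea concrete, below is a minimal, hypothetical sketch of Gibbs sampling for HMM parameters. It is written for a discrete-emission HMM with symmetric Dirichlet priors rather than the multivariate Gaussian mixture emissions used in the dissertation, and all function and variable names are illustrative assumptions, not the author's implementation.

```python
# Illustrative sketch only: per-site Gibbs sampling for a discrete-emission HMM.
# The dissertation's method targets Gaussian-mixture emissions; this simplified
# version only shows the alternating "sample states / sample parameters" loop.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_hmm(obs, n_states, n_symbols, n_iters=200, alpha=1.0):
    """Sample HMM parameters given one observation sequence `obs` (ints)."""
    T = len(obs)
    # Initialize parameters from symmetric Dirichlet priors.
    A = rng.dirichlet(alpha * np.ones(n_states), size=n_states)   # transitions
    B = rng.dirichlet(alpha * np.ones(n_symbols), size=n_states)  # emissions
    pi = rng.dirichlet(alpha * np.ones(n_states))                 # initial dist.
    z = rng.integers(n_states, size=T)                            # hidden states

    for _ in range(n_iters):
        # 1) Sample each hidden state from its full conditional, given its
        #    neighbouring states, the current parameters, and the observation.
        for t in range(T):
            p = B[:, obs[t]].copy()
            p *= pi if t == 0 else A[z[t - 1], :]
            if t < T - 1:
                p *= A[:, z[t + 1]]
            z[t] = rng.choice(n_states, p=p / p.sum())

        # 2) Sample parameters from their Dirichlet posteriors given the
        #    transition and emission counts implied by the sampled states.
        trans_counts = np.zeros((n_states, n_states))
        emit_counts = np.zeros((n_states, n_symbols))
        for t in range(T):
            emit_counts[z[t], obs[t]] += 1
            if t > 0:
                trans_counts[z[t - 1], z[t]] += 1
        A = np.vstack([rng.dirichlet(alpha + trans_counts[k]) for k in range(n_states)])
        B = np.vstack([rng.dirichlet(alpha + emit_counts[k]) for k in range(n_states)])
        pi = rng.dirichlet(alpha + np.bincount([z[0]], minlength=n_states))

    return A, B, pi, z
```

Because each update draws from a conditional distribution rather than following a gradient, the sampler can escape poor starting points, which is the robustness-to-initialization property the abstract contrasts with the gradient-based MCE approach.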
Keywords/Search Tags: Hidden Markov models, Feature, Learn, Gibbs sampling, Automatic, MCE, Using