
Theory And Applications Of Distance In Hidden Markov Models

Posted on: 2005-06-05    Degree: Master    Type: Thesis
Country: China    Candidate: F Xie    Full Text: PDF
GTID: 2120360155471998    Subject: Probability theory and mathematical statistics
Abstract/Summary:
As a class of statistical models, hidden Markov models (HMMs) were developed by Baum and his colleagues in the late 1960s and early 1970s. Since then they have gradually been applied in many fields, such as speech recognition, gene correlation analysis and gene recognition, character recognition, image processing, target tracking, and signal processing. The theory of HMMs mainly involves three problems: learning, recognition, and decoding. The learning problem, namely the parameter estimation problem, is the core problem of HMMs. This paper mainly discusses the theory and applications of distances in HMMs, which are closely related to the parameter estimation and clustering problems.

The Baum-Welch algorithm is commonly used to optimize the parameters in the parameter estimation problem. In theory, the parameters improve in every iteration, but in practice the recursion cannot run forever. When should the recursion stop? We need to define a distance to control the iterative process. We also need to define a distance between models for classification in the clustering problem. Earlier works have proposed several meaningful distances (such as documents [1-10]), but there is no normative definition. The most widely used distance was presented in Juang and Rabiner's paper [2] (the J-R distance for short); at present most of the literature uses this distance, or modifies it only slightly. However, this distance has two obvious shortcomings: (1) it is calculated by the Monte Carlo method, so the result is random; (2) its computational cost is large and its convergence is very slow, which can make it difficult to calculate at all. Building on the earlier works on distances in HMMs (we also give a rigorous proof of Juang and Rabiner's intuitive argument), this paper studies distances in HMMs from two aspects: one is to find a faster and more effective approximate algorithm; the other is to introduce a new and more practical distance.
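The Monte Carlo character of the J-R distance described above can be illustrated with a minimal sketch. The J-R distance between two discrete HMMs takes the form D(lambda1, lambda2) = (1/T)[log P(O|lambda1) - log P(O|lambda2)], where the observation sequence O is sampled from the first model; the randomness the text criticizes comes from that sampled sequence. The sketch below assumes each model is given as a tuple (pi, A, B) of plain Python lists; the function names are illustrative, not from the thesis.

```python
import math
import random

def forward_loglik(obs, pi, A, B):
    """log P(obs | model) via the scaled forward algorithm."""
    N = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    s = sum(alpha)
    loglik = math.log(s)
    alpha = [a / s for a in alpha]
    for o in obs[1:]:
        # Propagate through the transition matrix, then weight by emissions.
        alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][o]
                 for j in range(N)]
        s = sum(alpha)
        loglik += math.log(s)   # accumulate the scaling factors
        alpha = [a / s for a in alpha]
    return loglik

def sample_obs(T, pi, A, B, rng):
    """Draw an observation sequence of length T from the model."""
    state = rng.choices(range(len(pi)), weights=pi)[0]
    obs = []
    for _ in range(T):
        obs.append(rng.choices(range(len(B[state])), weights=B[state])[0])
        state = rng.choices(range(len(A[state])), weights=A[state])[0]
    return obs

def jr_distance(model1, model2, T=2000, seed=0):
    """Monte Carlo J-R distance: normalized log-likelihood gap on a
    sequence sampled from model1.  A different seed or T gives a
    different value -- this is the randomness noted in the text."""
    rng = random.Random(seed)
    obs = sample_obs(T, *model1, rng)
    return (forward_loglik(obs, *model1) - forward_loglik(obs, *model2)) / T
```

Note that D(lambda, lambda) is exactly zero, while for distinct models the value fluctuates around the relative-entropy rate as T grows, which is the information-theoretic interpretation the thesis develops.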
The article contains three innovations:
1. It completes the theory of the J-R distance and gives it a new interpretation through the relative entropy of information theory.
2. Aiming at the shortcoming that the J-R distance is theoretically meaningful but difficult to calculate and random in its result, we obtain a fast and effective method for computing it for a DHMM, or for a CHMM with Gaussian mixture densities, by making use of document [11]. The algorithm is expressed in matrix form, which makes it convenient to compute in MATLAB.
3. On the basis of stating and proving a theorem on translating a DHMM into a homogeneous Markov chain, we define a new distance on DHMMs by making use of document [12], which presents a distance between Markov chains for studying the dynamic clustering problem. In contrast to the J-R distance, the new distance is a determinate numerical value and is easy to calculate. Finally, simulation indicates that the new distance is appropriate.
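The translation in innovation 3 can be made concrete: for a DHMM, the pair process (q_t, o_t) of hidden state and observation is itself a homogeneous Markov chain, with transition probabilities P((i,k) -> (j,l)) = a_ij * b_j(l). Any distance between Markov chains then yields a determinate (non-random) distance between DHMMs. The sketch below uses a stationary-weighted L1 gap between transition rows purely as an illustrative choice; it is not the measure of document [12], and all function names and the specific formula are assumptions.

```python
def pair_chain(A, B):
    """Transition matrix of the pair process (state, observation):
    P((i,k) -> (j,l)) = A[i][j] * B[j][l].  With fixed DHMM parameters
    this pair process is a homogeneous Markov chain on N*M points."""
    N, M = len(A), len(B[0])
    P = [[0.0] * (N * M) for _ in range(N * M)]
    for i in range(N):
        for k in range(M):
            for j in range(N):
                for l in range(M):
                    P[i * M + k][j * M + l] = A[i][j] * B[j][l]
    return P

def stationary(P, iters=500):
    """Stationary distribution by power iteration (assumes the chain is
    irreducible and aperiodic, e.g. all parameters strictly positive)."""
    n = len(P)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(w[i] * P[i][j] for i in range(n)) for j in range(n)]
    return w

def chain_distance(A1, B1, A2, B2):
    """Illustrative determinate distance between two DHMMs:
    stationary-weighted L1 gap between pair-chain transition rows."""
    P1, P2 = pair_chain(A1, B1), pair_chain(A2, B2)
    w1, w2 = stationary(P1), stationary(P2)
    n = len(P1)
    return sum(0.5 * (w1[r] + w2[r]) *
               sum(abs(P1[r][c] - P2[r][c]) for c in range(n))
               for r in range(n))
```

Unlike the Monte Carlo J-R distance, this value depends only on the model parameters, so repeated evaluations always agree, which is the practical advantage the thesis claims for its new distance.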
Keywords/Search Tags:HMM, Forward-Backward algorithm, Baum-Welch algorithm, Viterbi algorithm, Kullback-Leibler information measure, Juang-Rabiner measure