
Information measures for sources with memory and their application to hypothesis testing and source coding

Posted on: 2003-04-28 | Degree: Ph.D | Type: Thesis
University: Queen's University (Canada) | Candidate: Rached, Ziad | Full Text: PDF
GTID: 2468390011489494 | Subject: Mathematics
Abstract/Summary:
In this work, we investigate Shannon's and Rényi's information measure rates for finite-alphabet time-invariant Markov sources of arbitrary order and arbitrary initial distributions, along with their application to hypothesis testing and source coding. We also study, using information-spectrum techniques, Csiszár's forward and reverse cutoff rates for the hypothesis testing problem between general sources with memory (including all non-ergodic or non-stationary sources) with arbitrary alphabet (countable or uncountable).

We first provide a computable expression for the Kullback-Leibler divergence rate, $\lim_{n\to\infty}\frac{1}{n}D(p^{(n)}\|q^{(n)})$, between two Markov sources described by the probability distributions $p^{(n)}$ and $q^{(n)}$, respectively. We illustrate it numerically and examine its rate of convergence. Similarly, we provide a formula for the Shannon entropy rate, $\lim_{n\to\infty}\frac{1}{n}H(p^{(n)})$, of Markov sources and examine its rate of convergence. As an application to hypothesis testing, we provide an alternative simple proof of Stein's lemma for testing between stationary irreducible Markov sources.

We also address the existence and the computation of the Rényi $\alpha$-divergence rate, $\lim_{n\to\infty}\frac{1}{n}D_\alpha(p^{(n)}\|q^{(n)})$, between Markov sources, where $\alpha > 0$ and $\alpha \neq 1$. We provide numerical examples and examine its rate of convergence, and we investigate the limits of the Rényi divergence rate as $\alpha \to 1$ and as $\alpha \downarrow 0$. Similarly, we provide a formula for the Rényi entropy rate, $\lim_{n\to\infty}\frac{1}{n}H_\alpha(p^{(n)})$, of Markov sources, and study its rate of convergence and its limits as $\alpha \to 1$ and as $\alpha \downarrow 0$. As an application to source coding, we present a generalization of Campbell's variable-length source coding theorem from discrete memoryless sources to Markov sources, which yields a new operational characterization of the Rényi entropy rate. The main tools used to establish the Shannon and Rényi information measure rate results are the theory of non-negative matrices and Perron-Frobenius theory.

We next establish an operational characterization of the Rényi $\alpha$-divergence rate by showing, via an information-spectrum approach, that the Csiszár forward $\beta$-cutoff rate for the hypothesis testing problem between general sources with memory is given by the $\liminf$ $\alpha$-divergence rate with $\alpha = \frac{1}{1-\beta}$. The Csiszár forward $\beta$-cutoff rate ($\beta < 0$) for hypothesis testing is defined as the largest rate $R_0 \ge 0$ such that for all rates $0 < E < R_0$, the best (i.e., smallest) type 1 error probability achievable by tests of sample size $n$ with type 2 error probability $\le e^{-nE}$ vanishes asymptotically as $e^{-n\beta(E - R_0)}$. We also demonstrate that, under some conditions on the large deviation spectrum, the Csiszár reverse $\beta$-cutoff rate for the general hypothesis testing problem is given by the $\limsup$ $\alpha$-divergence rate with $\alpha = \frac{1}{1-\beta}$. The Csiszár reverse $\beta$-cutoff rate ($\beta > 0$) for hypothesis testing is defined as the smallest rate $R_0 \ge 0$ such that for all rates $E > R_0$, the best (i.e., largest) type 1 correct-decision probability achievable by tests of sample size $n$ with type 2 error probability $\le e^{-nE}$ vanishes asymptotically as $e^{-n\beta(E - R_0)}$. Furthermore, we investigate the important classes of discrete memoryless sources and of sources satisfying the hypotheses of the Gärtner-Ellis theorem, for which the forward and reverse $\beta$-cutoff rates are computable. Finally, we conclude with observations and remarks, along with several possible directions for future work.
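As a numerical illustration of the closed-form rates described above, the sketch below (not taken from the thesis, and simplified to first-order, stationary, irreducible chains, whereas the thesis treats arbitrary order and arbitrary initial distributions) computes the Kullback-Leibler divergence rate, which reduces to $\sum_i \pi_i \sum_j P_{ij}\log(P_{ij}/Q_{ij})$, and the Rényi $\alpha$-divergence rate, obtained from the largest (Perron-Frobenius) eigenvalue of the matrix with entries $P_{ij}^{\alpha}Q_{ij}^{1-\alpha}$. The function names and the toy two-state example are illustrative, not the thesis's notation.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of an irreducible row-stochastic matrix P
    (left eigenvector of P for eigenvalue 1, normalized to sum to 1)."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    k = np.argmin(np.abs(eigvals - 1.0))
    pi = np.real(eigvecs[:, k])
    return pi / pi.sum()

def kl_divergence_rate(P, Q):
    """KL divergence rate (in nats) between stationary irreducible Markov
    chains with transition matrices P and Q; assumes Q[i, j] > 0 wherever
    P[i, j] > 0 so that the rate is finite."""
    pi = stationary_distribution(P)
    mask = P > 0
    safe_P = np.where(mask, P, 1.0)   # placeholders keep log() well defined
    safe_Q = np.where(mask, Q, 1.0)
    return float(np.sum(pi[:, None] * np.where(mask, P * np.log(safe_P / safe_Q), 0.0)))

def renyi_divergence_rate(P, Q, alpha):
    """Renyi alpha-divergence rate (in nats): log of the Perron-Frobenius
    eigenvalue of the matrix with entries P[i, j]**alpha * Q[i, j]**(1 - alpha),
    divided by (alpha - 1). Assumes strictly positive P and Q for simplicity."""
    R = P**alpha * Q**(1.0 - alpha)
    lam = np.max(np.real(np.linalg.eigvals(R)))  # Perron-Frobenius eigenvalue
    return float(np.log(lam) / (alpha - 1.0))

# Toy two-state chains.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])
print("KL divergence rate:    ", kl_divergence_rate(P, Q))
print("Renyi rate (alpha=0.5):", renyi_divergence_rate(P, Q, alpha=0.5))
```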
Keywords/Search Tags: Sources, Rate, Hypothesis testing, Information, Alpha, Application, Rényi