
Vector Gaussian Multi-Terminal Source Coding

Posted on: 2017-05-12    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y F Xu    Full Text: PDF
GTID: 1108330491462513    Subject: Signal and Information Processing
Abstract/Summary:
Multiuser information theory has been at the forefront of the scientific community for more than five decades. This dissertation investigates the multi-terminal source coding problem in the vector Gaussian setting. The main effort is devoted to characterizing the rate distortion regions of several fundamental models, such as the distributed source coding problem and the multiple description coding problem.

First, we re-derive the rate distortion region of the vector Gaussian one-help-one problem. For vector Gaussian sources, the major difficulty is that the entropy power inequality used in the scalar Gaussian case is not necessarily tight; unlike the original proof by Rahman and Wagner, our proof does not rely on the enhancement technique. Instead, we express the entropy in terms of the Fisher information matrix and derive a new extremal inequality by integrating along a path of continuous Gaussian perturbation. We then apply this extremal inequality to characterize the entire rate region of the vector Gaussian one-help-one problem.

Second, we characterize the rate region of the vector Gaussian CEO (Chief Executive Officer) problem under a total average quadratic distortion constraint. To prove the converse part of the rate distortion region, we develop a new analysis technique based on a spectral decomposition of the mean squared error in the Berger-Tung scheme, in which the perturbation method is combined with a detailed analysis of the Karush-Kuhn-Tucker necessary conditions of a non-convex optimization problem. This enables us to integrate the perturbation argument of Wang and Chen with the distortion projection method of Rahman and Wagner.

Finally, we introduce the problem of multiple description coding with tree-structured distortion constraints. In particular, a single-letter lower bound on the minimum sum rate is derived. For the vector Gaussian source with covariance distortion constraints, this lower bound is shown to coincide with the rate achieved by the generalized El Gamal-Cover scheme. The key step in establishing the lower bound is to generalize Ozarow's argument by introducing a Markov tree expansion of the original probability space through multiple auxiliary random variables.
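As general background for the entropy power inequality and the Fisher-information-based perturbation argument mentioned above, the following standard identities may help orient the reader. They are textbook facts (the vector entropy power inequality and the vector de Bruijn identity), not the dissertation's own derivations; the minimal LaTeX sketch below simply typesets them.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Entropy power of an n-dimensional random vector X with differential entropy h(X),
% and the entropy power inequality for independent X and Y.
\[
  N(X) \triangleq \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)},
  \qquad
  N(X+Y) \ge N(X) + N(Y)
  \quad \text{for independent } X, Y \in \mathbb{R}^{n}.
\]
% Vector de Bruijn identity: derivative of differential entropy along a Gaussian
% perturbation path, with J(.) the Fisher information matrix and Z ~ N(0, I_n)
% independent of X. Integrating such identities over the perturbation path is the
% general flavor of the extremal-inequality arguments referred to in the abstract.
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\,Z\right)
  = \frac{1}{2}\operatorname{tr}\!\left( J\!\left(X + \sqrt{t}\,Z\right) \right),
  \qquad t > 0.
\]
\end{document}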
Keywords/Search Tags: CEO problem, data compression, extremal inequality, distributed source coding, multiple description coding, Markov tree, multiuser information theory, one-help-one problem, rate distortion theory, remote source, vector Gaussian source