Information theory methods in communication complexity

Posted on: 2013-11-20
Degree: Ph.D
Type: Dissertation
University: Rutgers, The State University of New Jersey - New Brunswick
Candidate: Leonardos, Nikolaos
Full Text: PDF
GTID: 1458390008974627
Subject: Computer Science
Abstract/Summary:
This dissertation is concerned with the application of notions and methods from information theory to communication complexity. It consists of two main parts.

In the first part of the dissertation, we prove lower bounds on the randomized two-party communication complexity of functions that arise from read-once boolean formulae. A read-once boolean formula is a formula in propositional logic in which every variable appears exactly once. Such a formula can be represented by a tree whose leaves correspond to variables and whose internal nodes are labeled by binary connectives. Under certain assumptions, this representation is unique, so one can define the depth of a formula as the depth of the tree that represents it. The complexity of evaluating general read-once formulae has attracted interest mainly in the decision-tree model; in the communication complexity model, many interesting results deal with specific read-once formulae, such as disjointness and tribes. In this dissertation we use information-theoretic methods to prove lower bounds that hold for arbitrary read-once formulae. Our lower bounds are of the form n(f)/c^{d(f)}, where n(f) is the number of variables of the formula f, d(f) is its depth, and c is a constant; they are optimal up to the constant in the base of the denominator.

In the second part of the dissertation, we explore the applicability of the information-theoretic method in the number-on-the-forehead model. The work of Bar-Yossef, Jayram, Kumar & Sivakumar [BYJKS04] revealed a beautiful connection between Hellinger distance and two-party randomized communication protocols. Inspired by their work, and motivated by open questions in the number-on-the-forehead model, we introduce the notion of Hellinger volume. We show that it lower-bounds the information cost of multi-party protocols. We provide a small toolbox that allows one to manipulate several Hellinger volume terms, and to lower-bound the Hellinger volume when the distributions involved satisfy certain conditions. In doing so, we prove a new upper bound on the difference between the arithmetic mean and the geometric mean in terms of relative entropy. Finally, we show how to apply the new tools to obtain a lower bound on the informational complexity of the AND_k function.
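The connection of [BYJKS04] runs through the following standard quantity: for probability distributions P and Q over a finite set X, the squared Hellinger distance is

    h^2(P, Q) = 1 - \sum_{x \in X} \sqrt{P(x) Q(x)}.

Hellinger volume generalizes this to k >= 2 distributions; a natural reading, consistent with the arithmetic-mean/geometric-mean bound mentioned above, replaces \sqrt{P(x) Q(x)} by the geometric mean (\prod_{i=1}^{k} P_i(x))^{1/k}, though the precise definition used in the dissertation is given in the full text.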
Keywords/Search Tags: Information theory, Communication complexity, Lower bounds, Read-once formulae, Hellinger volume