Revised: January 2, 2018
Published: December 27, 2018
Abstract:
We give an example of a Boolean function whose information complexity is exponentially smaller than its communication complexity. Such a separation was first demonstrated by Ganor, Kol and Raz (J. ACM 2016). We give a simpler proof of the same result. In the course of this simplification, we make several new contributions: we introduce a new communication lower-bound technique, the notion of a fooling distribution, which allows us to separate information and communication complexity; and we give a more direct proof of the information complexity upper bound.
We also prove a generalization of Shearer's Lemma that may be of independent interest. A version of Shearer's original lemma bounds the expected mutual information between a subset of random variables and another random variable, when the subset is chosen independently of all the random variables involved. Our generalization allows some dependence between the random subset and the random variables, and still yields similar bounds with an appropriate error term.
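For concreteness, one standard form of the independent-subset statement, sketched here under the added assumption that $X_1, \dots, X_n$ are mutually independent (the exact formulation in the paper may differ): if $S \subseteq [n]$ is a random subset, independent of $(X_1, \dots, X_n, Y)$, with $\Pr[i \in S] \le \mu$ for every $i \in [n]$, then
$$ \mathop{\mathbb{E}}_{S}\big[\, I(X_S ; Y) \,\big] \;\le\; \mu \cdot I(X_1, \dots, X_n ; Y). $$
The generalization proved in the paper relaxes the independence requirement on $S$, at the cost of the error term mentioned above.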
A preliminary version of this paper appeared in ECCC as Technical Report TR15-057.