Information and Interactive Communication

Speaker: Mark Braverman

Affiliation: University of Toronto

Host: Jennifer Chayes

Duration: 01:03:05

Date recorded: 10 February 2011

Notions of entropy and information, pioneered by Shannon, have been very powerful tools in coding theory. Coding theory aims to solve the problem of one-way communication: sending a message from Alice to Bob using as little communication as possible, sometimes over a noisy channel.
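For background (standard facts, not part of the abstract): in the one-way setting, Shannon's entropy pins down the optimal code length exactly, and this tight characterization is the benchmark that the interactive results below aim to match.

```latex
% Shannon entropy of a source X with distribution p:
H(X) = -\sum_{x} p(x) \log_2 p(x)

% Source coding: the expected length L of an optimal prefix-free
% binary code for X satisfies
H(X) \le L < H(X) + 1
```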

Communication complexity addresses the corresponding two-way problem: Alice and Bob wish to implement a functionality f that depends on both parties’ inputs. We will discuss several extensions of information-theoretic notions to this two-way setting, and use them to prove a direct sum theorem for randomized communication complexity, showing that implementing k copies of a functionality requires substantially more communication than a single copy, partially settling a long-standing open problem.
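For a sense of the quantitative form (the abstract does not state the bound; the following is the shape of the direct sum result from the related Barak–Braverman–Chen–Rao work, with error parameters and polylogarithmic factors suppressed):

```latex
% f^k denotes k independent copies of f, and R denotes randomized
% communication complexity with constant error. Up to polylog
% factors, k copies cost at least \sqrt{k} times one copy:
R(f^k) \ge \widetilde{\Omega}\left(\sqrt{k} \cdot R(f)\right)
```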

More generally, we will show that the information cost I(f) can be defined as a natural, fundamental property of a functionality f. We will describe several new, tight connections between I(f), direct sum theorems, interactive compression schemes, and amortized communication complexity.
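For concreteness (the abstract fixes no notation; the subscript μ, the transcript Π, and D_μ for distributional communication complexity are choices made here, with error parameters suppressed), the internal information cost measures what each party learns about the other’s input, and the last identity below is the “information equals amortized communication” connection alluded to above:

```latex
% Internal information cost of a protocol \pi with transcript \Pi
% on inputs (X, Y) \sim \mu: what Bob learns about X plus what
% Alice learns about Y.
I_\mu(\pi) = I(\Pi ; X \mid Y) + I(\Pi ; Y \mid X)

% Information cost of f under \mu: the least such leakage over
% all protocols that compute f.
I_\mu(f) = \inf_{\pi \text{ computing } f} I_\mu(\pi)

% Tight connection to amortized communication: the per-copy cost
% of many independent copies converges to the information cost.
I_\mu(f) = \lim_{n \to \infty} \frac{D_\mu(f^n)}{n}
```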

©2011 Microsoft Corporation. All rights reserved.