Information and Interactive Communication

Notions of entropy and information, pioneered by Shannon, have been very powerful tools in coding theory. Coding theory aims to solve the problem of one-way communication: sending a message from Alice to Bob using as little communication as possible, sometimes over a noisy channel.
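For reference, the basic quantity at play is Shannon's entropy (the definitions below are the standard ones, not specific to this talk):

    H(X) = -\sum_{x} \Pr[X = x] \log_2 \Pr[X = x]

Shannon's source coding theorem states that n independent copies of a source X can be transmitted using n(H(X) + o(1)) bits, so entropy exactly captures the amortized cost of one-way communication.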

Communication complexity aims to solve the problem of two-way communication: Alice and Bob aim to implement a functionality f that depends on both parties' inputs. We will discuss several extensions of information-theoretic notions to the two-way communication setting, and use them to prove a direct sum theorem for randomized communication complexity: implementing k independent copies of a functionality requires substantially more communication than implementing a single copy. This partially settles a long-standing open problem.
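Schematically (the precise quantitative form is not stated in this abstract; the bound below is the shape such results take in this line of work, with R denoting randomized communication complexity), a direct sum theorem asserts that

    R(f^k) = \tilde{\Omega}\!\left(\sqrt{k} \cdot R(f)\right),

i.e., the cost of k copies grows polynomially with k rather than remaining comparable to the cost of a single copy.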

More generally, we will show that information cost I(f) can be defined as a natural fundamental property of a functionality f. We will describe several new tight connections between I(f), direct sum theorems, interactive compression schemes, and amortized communication complexity.
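Concretely, for a protocol \pi with transcript \Pi run on inputs (X, Y) drawn from a distribution \mu, the internal information cost measures what each player learns about the other's input (this follows the standard treatment in this area; the talk may use a variant):

    IC_\mu(\pi) = I(\Pi ; X \mid Y) + I(\Pi ; Y \mid X),

with I(f) the infimum of IC_\mu(\pi) over protocols \pi computing f. The connection to amortized communication then takes the form I(f) = \lim_{n \to \infty} R(f^n)/n: information cost is to interactive communication what entropy is to one-way transmission.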

Speaker Details

Mark Braverman is an assistant professor of mathematics and computer science at the University of Toronto. He received his PhD in 2008 under the supervision of Stephen Cook, and spent two years as a postdoctoral researcher with Microsoft Research New England in Cambridge, MA. He is the recipient of an NSERC Doctoral Prize and the Canadian Mathematical Society Doctoral Prize. Mark’s major interests are in the areas of computational complexity, algorithms, mechanism design, and applications of information technology in healthcare.

Date:
Speakers: Mark Braverman
Affiliation: University of Toronto