Dissipation of Information in Channels with Input Constraints

One of the basic tenets of information theory, the data processing inequality, states that the output divergence of any channel does not exceed its input divergence. For channels without input constraints, various estimates of the amount of such contraction are known, e.g., Dobrushin's coefficient for total variation. This work investigates channels with an average input cost constraint. It is found that while the contraction coefficient typically equals one, information nevertheless dissipates. A certain non-linear function, the Dobrushin curve of the channel, is proposed to quantify the amount of dissipation. Tools for evaluating the Dobrushin curve of additive-noise channels are developed. Applications in stochastic control, uniqueness of Gibbs measures, and noisy circuits are discussed.
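In standard notation (a minimal sketch for readers unfamiliar with these notions; the cost function c and budget A below are my notation, not taken from the abstract): for a channel P_{Y|X} mapping input laws P_X, Q_X to output laws P_Y, Q_Y, the data processing inequality reads

\[ D(P_Y \,\|\, Q_Y) \;\le\; D(P_X \,\|\, Q_X), \]

and Dobrushin's contraction coefficient for total variation is

\[ \eta_{\mathrm{TV}} \;=\; \sup_{x,x'} \mathrm{TV}\bigl(P_{Y|X=x},\, P_{Y|X=x'}\bigr), \qquad \mathrm{TV}(P_Y, Q_Y) \;\le\; \eta_{\mathrm{TV}}\, \mathrm{TV}(P_X, Q_X). \]

Under an average cost constraint the best linear bound is typically trivial (\eta_{\mathrm{TV}} = 1), and a Dobrushin curve replaces the single coefficient with a non-linear bound; schematically,

\[ F(t) \;=\; \sup\bigl\{ \mathrm{TV}(P_Y, Q_Y) \;:\; \mathrm{TV}(P_X, Q_X) \le t,\; \mathbb{E}_{P_X}[c(X)] \le A,\; \mathbb{E}_{Q_X}[c(X)] \le A \bigr\}, \]

so that \mathrm{TV}(P_Y, Q_Y) \le F(\mathrm{TV}(P_X, Q_X)) for all admissible input pairs.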

Speaker Details

Yury Polyanskiy is an Assistant Professor in the Department of Electrical Engineering and Computer Science (EECS) at MIT and a member of LIDS. Previously he was a postdoc at Princeton University hosted by Sergio Verdú, with whom he worked on various topics in information theory. In 2013 he received an NSF CAREER award. For more information see http://people.lids.mit.edu/yp/cv.pdf

Date:
Speakers: Yury Polyanskiy
Affiliation: MIT

Series: Microsoft Research Talks