Do optimal entropy-constrained quantizers have a finite or infinite number of codewords?

András György, Tamás Linder, Philip A. Chou, and Bradley J. Betts

November 2003

An entropy-constrained quantizer is optimal if it minimizes the expected distortion subject to a constraint on the output entropy. In this correspondence, we use the Lagrangian formulation to show the existence and study the structure of optimal entropy-constrained quantizers that achieve a point on the lower convex hull of the operational distortion-rate function. In general, an optimal entropy-constrained quantizer may have a countably infinite number of codewords. Our main results show that if the tail of the source distribution is sufficiently light (resp., heavy) with respect to the distortion measure, the Lagrangian-optimal entropy-constrained quantizer has a finite (resp., infinite) number of codewords. In particular, for the squared error distortion measure, if the tail of the source distribution is lighter than the tail of a Gaussian distribution, then the Lagrangian-optimal quantizer has only a finite number of codewords, while if the tail is heavier than that of the Gaussian, the Lagrangian-optimal quantizer has an infinite number of codewords.
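To make the Lagrangian formulation concrete, the sketch below implements a standard ECVQ-style Lloyd iteration for a scalar source under squared error: each sample is assigned to the codeword minimizing the Lagrangian cost (squared error plus λ times the codeword's ideal code length), then codewords are re-centered and code lengths re-estimated. This is an illustrative sketch, not the paper's construction; the function name `ecq_design`, the initial codebook, and the stopping rule are all assumptions for the example.

```python
import math
import random

def ecq_design(samples, init_codebook, lam, iterations=20):
    """Entropy-constrained quantizer design via Lagrangian Lloyd iteration.

    Approximately minimizes E[(X - Q(X))^2] + lam * H(Q(X)) by alternating:
      1) assignment: each sample goes to the codeword minimizing
         squared error + lam * (-log2 p(codeword)),
      2) update: codewords move to cell centroids, code lengths are
         re-estimated from the empirical cell probabilities.
    Returns the final codebook and the per-codeword code lengths
    (length = inf marks a codeword whose cell became empty, i.e. dropped).
    """
    n = len(samples)
    k = len(init_codebook)
    codebook = list(init_codebook)
    lengths = [math.log2(k)] * k  # start from uniform code lengths

    def cost(x, i):
        # Guard against 0 * inf when lam == 0 and a cell is empty.
        rate = lam * lengths[i] if lam > 0.0 else 0.0
        return (x - codebook[i]) ** 2 + rate

    for _ in range(iterations):
        cells = [[] for _ in range(k)]
        for x in samples:                       # assignment step
            j = min(range(k), key=lambda i: cost(x, i))
            cells[j].append(x)
        for i in range(k):                      # update step
            if cells[i]:
                codebook[i] = sum(cells[i]) / len(cells[i])
                lengths[i] = -math.log2(len(cells[i]) / n)
            else:
                lengths[i] = float("inf")       # unused codeword
    return codebook, lengths
```

Running the sketch with a larger λ typically leaves more codewords unused, illustrating how the effective number of codewords of the Lagrangian-optimal quantizer depends on the trade-off parameter and on the source distribution.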

| Publication type | Article |
| Published in | IEEE Trans. Information Theory |
| Publisher | Institute of Electrical and Electronics Engineers, Inc. |

© 2003 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

Related publications:

- Conditional entropy-constrained vector quantization of linear predictive coefficients
- Distributing Streaming Media Content using Cooperative Networking
- Document image decoding using Markov sources
