Do optimal entropy-constrained quantizers have a finite or infinite number of codewords?

András György, Tamás Linder, Philip A. Chou, and Bradley J. Betts

Abstract

An entropy-constrained quantizer is optimal if it minimizes the expected distortion subject to a constraint on the output entropy. In this correspondence, we use the Lagrangian formulation to show the existence and study the structure of optimal entropy-constrained quantizers that achieve a point on the lower convex hull of the operational distortion-rate function. In general, an optimal entropy-constrained quantizer may have a countably infinite number of codewords. Our main results show that if the tail of the source distribution is sufficiently light (resp., heavy) with respect to the distortion measure, the Lagrangian-optimal entropy-constrained quantizer has a finite (resp., infinite) number of codewords. In particular, for the squared error distortion measure, if the tail of the source distribution is lighter than the tail of a Gaussian distribution, then the Lagrangian-optimal quantizer has only a finite number of codewords, while if the tail is heavier than that of the Gaussian, the Lagrangian-optimal quantizer has an infinite number of codewords.
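The Lagrangian formulation mentioned above trades distortion against rate by minimizing an unconstrained cost D + λH instead of minimizing D subject to an entropy constraint. A minimal numpy sketch of this idea, in the style of the classical entropy-constrained (ECVQ) design of Chou, Lookabaugh, and Gray, is below; the function names, initialization, and iteration count are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def ecq_encode(x, codewords, lengths, lam):
    # Lagrangian encoding rule: pick the index minimizing
    # squared error plus lambda times the codeword's ideal code length.
    costs = (x - codewords) ** 2 + lam * lengths
    return int(np.argmin(costs))

def design_ecq(samples, n_init, lam, iters=50):
    # Lloyd-style alternating design: Lagrangian encoding,
    # centroid update, and code-length update (-log2 of cell probability).
    codewords = np.quantile(samples, np.linspace(0.05, 0.95, n_init))
    lengths = np.full(n_init, np.log2(n_init))
    for _ in range(iters):
        costs = (samples[:, None] - codewords[None, :]) ** 2 + lam * lengths[None, :]
        assign = np.argmin(costs, axis=1)
        probs = np.bincount(assign, minlength=len(codewords)) / len(samples)
        keep = probs > 0  # empty cells are pruned; higher lambda kills more cells
        codewords = np.array([samples[assign == i].mean() for i in np.flatnonzero(keep)])
        probs = probs[keep]
        lengths = -np.log2(probs)
    return codewords, lengths
```

Note that the number of surviving codewords here is always finite because the initial codebook is finite; the paper's question is whether the true Lagrangian-optimal quantizer needs infinitely many cells, which this finite-codebook sketch cannot exhibit.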

Details

Publication type: Article
Published in: IEEE Trans. Information Theory
Publisher: Institute of Electrical and Electronics Engineers, Inc.