Entropy, Vol. 19, No. 6 (June 2017)
ARTICLE
TITLE

Projection to Mixture Families and Rate-Distortion Bounds with Power Distortion Measures

SUMMARY

The explicit form of the rate-distortion function has rarely been obtained, except for a few cases where the Shannon lower bound coincides with the rate-distortion function over the entire range of positive rates. From an information-geometric point of view, evaluating the rate-distortion function amounts to a projection onto the mixture family defined by the distortion measure. In this paper, we consider the β-th power distortion measure and prove that the β-generalized Gaussian distribution is the only source that makes the Shannon lower bound tight at the minimum distortion level at zero rate. We demonstrate that the tightness of the Shannon lower bound for β = 1 (Laplacian source) and β = 2 (Gaussian source) yields upper bounds on the rate-distortion function for power distortion measures with a different power. These bounds evaluate from above the projection of the source distribution onto the mixture family of generalized Gaussian models. Applying similar arguments to ε-insensitive distortion measures, we examine the tightness of the Shannon lower bound and derive an upper bound on the distortion-rate function that is accurate at low rates.
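For orientation, here is a standard form of the Shannon lower bound for the β-th power distortion d(x, y) = |x − y|^β, in nats; this is a sketch in one common normalization, not necessarily the paper's notation:

\[
R(D) \;\ge\; R_{\mathrm{SLB}}(D)
= h(X) - \max_{\mathbb{E}|Z|^{\beta} \le D} h(Z)
= h(X) - \frac{1}{\beta}\ln(e\beta D) - \ln\frac{2\,\Gamma(1/\beta)}{\beta},
\]

where the maximum-entropy term is attained by the β-generalized Gaussian density

\[
p_{\beta}(z) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}\,
\exp\!\bigl(-|z/\alpha|^{\beta}\bigr),
\qquad \alpha = (\beta D)^{1/\beta}.
\]

Setting β = 2 recovers the Gaussian bound h(X) − (1/2)ln(2πeD), and β = 1 the Laplacian bound h(X) − ln(2eD), the two cases for which the abstract notes the bound is tight.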

RELATED ARTICLES

Chung Chan    

The multiterminal secret key agreement problem by public discussion is formulated with an additional source compression step where, prior to the public discussion phase, users independently compress their private sources to filter out strongly correlated...

Journal: Entropy

Steeve Zozor, David Puertas-Centeno and Jesús S. Dehesa    

Information-theoretic inequalities play a fundamental role in numerous scientific and technological areas (e.g., estimation and communication theories, signal and information processing, quantum physics, …) as they generally express the impossibility to ...

Journal: Entropy

Artemy Kolchinsky and Brendan D. Tracey    

Mixture distributions arise in many parametric and non-parametric settings—for example, in Gaussian mixture models and in non-parametric estimation. It is often necessary to compute the entropy of a mixture, but, in most cases, this quantity has no close... (a minimal Monte Carlo sketch follows this entry)

Journal: Entropy
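Since the entropy of a mixture generally has no closed form, a common fallback is Monte Carlo estimation of H(p) = −E_p[log p(X)]. Below is a minimal sketch for a two-component Gaussian mixture; the weights, means, and scales are illustrative and not taken from the paper:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-component 1-D Gaussian mixture.
w = np.array([0.3, 0.7])       # mixture weights
mu = np.array([-2.0, 1.0])     # component means
sigma = np.array([1.0, 0.5])   # component standard deviations

def mixture_pdf(x):
    # p(x) = sum_k w_k * N(x; mu_k, sigma_k^2), vectorized over the samples in x.
    comp = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return comp @ w

# Estimate H(p) = -E_p[log p(X)]: draw a component index for each sample,
# draw the sample from that component, then average the negative log-density
# of the full mixture over the samples.
n = 200_000
k = rng.choice(len(w), size=n, p=w)
x = rng.normal(mu[k], sigma[k])
print(f"Monte Carlo entropy estimate: {-np.mean(np.log(mixture_pdf(x))):.4f} nats")

The estimator is unbiased and converges at the usual O(n^{-1/2}) Monte Carlo rate; closed-form bounds of the kind studied in the paper avoid sampling entirely.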

Frank Hansen, Jin Liang and Guanghua Shi    

We study the convexity or concavity of certain trace functions for the deformed logarithmic and exponential functions, and in this way obtain new trace inequalities for deformed exponentials that may be considered as generalizations of Peierls–Bogolyubov... (the deformed functions are recalled after this entry)

Journal: Entropy
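For reference, one common (Tsallis-type) convention for the deformed logarithm and exponential; the paper's parametrization may differ:

\[
\ln_q(x) = \frac{x^{1-q} - 1}{1-q}, \qquad
\exp_q(x) = \bigl[1 + (1-q)\,x\bigr]_{+}^{1/(1-q)}, \qquad q \neq 1,
\]

with both reducing to the ordinary logarithm and exponential in the limit q → 1.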

André Schlichting    

This work studies mixtures of probability measures on R^n and gives bounds on the Poincaré and the log–Sobolev constants of two-component mixtures provided that each component satisfies the functional inequality, and both components... (both inequalities are recalled after this entry)

Journal: Entropy
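For reference, the two functional inequalities in question, in one standard normalization (conventions for the constants vary): a measure μ on R^n satisfies a Poincaré inequality with constant C_P and a log–Sobolev inequality with constant C_LS if, for all smooth f,

\[
\operatorname{Var}_{\mu}(f) \le C_{\mathrm{P}} \int |\nabla f|^{2}\, d\mu,
\qquad
\operatorname{Ent}_{\mu}\!\bigl(f^{2}\bigr) \le C_{\mathrm{LS}} \int |\nabla f|^{2}\, d\mu,
\]

where Ent_μ(g) = ∫ g log g dμ − (∫ g dμ) log ∫ g dμ.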