Entropy, Vol. 19, No. 9 (September 2017)
TITLE

On Generalized Stam Inequalities and Fisher–Rényi Complexity Measures

SUMMARY

Information-theoretic inequalities play a fundamental role in numerous scientific and technological areas (e.g., estimation and communication theories, signal and information processing, quantum physics, ...), as they generally express the impossibility of completely describing a system via a finite number of information measures. In particular, they have given rise to the design of various quantifiers (statistical complexity measures) of the internal complexity of a (quantum) system. In this paper, we introduce a three-parameter Fisher–Rényi complexity, named the (p, β, λ)-Fisher–Rényi complexity, based on both a two-parameter extension of the Fisher information and the Rényi entropies of a probability density function ρ characteristic of the system. This complexity measure quantifies the combined balance of the spreading and gradient contents of ρ, and has the three main properties of a statistical complexity: invariance under translation and scaling transformations, and a universal lower bound. The latter is proved by generalizing the Stam inequality, which lower-bounds the product of the Shannon entropy power and the Fisher information of a probability density function. Extensions of this inequality were previously proposed by Bercher and Lutwak; they are particular cases of the general one, in which the three parameters are linked, allowing the sharp lower bound and the associated minimal-complexity probability density to be determined. Using the notion of differential-escort deformation, we are able to determine the sharp bound of the complexity measure even when the three parameters are decoupled (in a certain range). We also determine the distribution that saturates the inequality: the (p, β, λ)-Gaussian distribution, which involves an inverse incomplete beta function.
Finally, the complexity measure is calculated for various quantum-mechanical states of the harmonic and hydrogenic systems, which are the two main prototypes of physical systems subject to a central potential.
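The universal lower bound above generalizes the classical Stam inequality, which for a one-dimensional probability density ρ states N(ρ)·J(ρ) ≥ 1, where N(ρ) = exp(2h(ρ))/(2πe) is the Shannon entropy power and J(ρ) the Fisher information, with equality exactly for Gaussian densities. As an illustrative sketch only (this checks the classical, non-generalized inequality; the grid bounds and step count are arbitrary choices, not taken from the paper):

```python
import math

def stam_product(pdf, lo, hi, n=40001):
    # Grid estimate of N(rho) * J(rho) for a 1D density:
    #   h = -integral( rho * ln(rho) ),  J = integral( (rho')^2 / rho ),
    #   N = exp(2h) / (2*pi*e).
    # Rectangle-rule sums over interior points; endpoint mass is negligible
    # when [lo, hi] covers the tails.
    dx = (hi - lo) / (n - 1)
    xs = [lo + i * dx for i in range(n)]
    p = [pdf(x) for x in xs]
    h = 0.0
    J = 0.0
    for i in range(1, n - 1):
        if p[i] > 1e-300:
            h -= p[i] * math.log(p[i]) * dx
            dpdx = (p[i + 1] - p[i - 1]) / (2 * dx)  # central difference
            J += dpdx * dpdx / p[i] * dx
    N = math.exp(2 * h) / (2 * math.pi * math.e)  # Shannon entropy power
    return N * J

def gaussian(x, mu=0.0, s=1.0):
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

# Gaussian saturates the bound: the product is close to 1.
print(stam_product(lambda x: gaussian(x), -12, 12))
# A bimodal Gaussian mixture stays strictly above the bound.
mix = lambda x: 0.5 * gaussian(x, -2.0) + 0.5 * gaussian(x, 2.0)
print(stam_product(mix, -14, 14))
```

The bimodal case shows why the product is a complexity-type quantifier: separating the two modes inflates the entropy power faster than it reduces the Fisher information, pushing the product well above the Gaussian minimum.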

Related articles

Nicholas V. Sarlis    

Natural time is a new time domain introduced in 2001. The analysis of time series associated with a complex system in natural time may provide useful information and may reveal properties that are usually hidden when studying the system in conventional t...

Journal: Entropy

Masatoshi Funabashi    

Recently emerging data-driven citizen sciences need to harness an increasing amount of massive data of varying quality. This paper develops essential theoretical frameworks, example models, and a general definition of a complexity measure, and examines i...

Journal: Entropy

Carmina Coronel, Heinrich Garn, Markus Waser, Manfred Deistler, Thomas Benke, Peter Dal-Bianco, Gerhard Ransmayr, Stephan Seiler, Dieter Grossegger and Reinhold Schmidt    

Analysis of nonlinear quantitative EEG (qEEG) markers describing the complexity of the signal in relation to the severity of Alzheimer’s disease (AD) was the focal point of this study, in which 79 patients diagnosed with probable AD were recruited from the mul...

Journal: Entropy

Miguel Melgarejo and Nelson Obregon    

Information production in both space and time has been highlighted as one of the elements that shapes the footprint of complexity in natural and socio-technical systems. However, information production in urban crime has barely been studied. This work co...

Journal: Entropy

Xiao Zhang, Xia Liu and Yanyan Yang    

The information entropy developed by Shannon is an effective measure of uncertainty in data, and rough set theory is a useful tool in computer applications for dealing with vague and uncertain data. At present, the information entropy...

Journal: Entropy