We introduce a class of information measures based on group entropies, which allows us to describe the information-theoretic properties of complex systems. These entropic measures are nonadditive and are deduced mathematically from a set of natural axioms. In addition, we require extensivity, to ensure that our information measures are physically meaningful. The proposed entropic measures are tailored to the description of universality classes of complex systems, each characterized by a specific state-space growth rate function.
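As a minimal worked example of the extensivity requirement (a standard computation in this literature, not drawn from the abstract itself), let $W(N)$ denote the number of states accessible to a system of $N$ constituents, and evaluate each entropy on the uniform distribution $p_i = 1/W(N)$; extensivity demands $S \propto N$. For exponential growth, $W(N) = k^N$, the Boltzmann-Gibbs entropy is extensive:
$$
S_{BG} = -\sum_{i=1}^{W(N)} p_i \ln p_i = \ln W(N) = N \ln k \;\propto\; N .
$$
For power-law growth, $W(N) = N^a$, one gets $S_{BG} = a \ln N$, which is not extensive; instead the nonadditive Tsallis entropy with matched index $q = 1 - 1/a$ is:
$$
S_q = \frac{1 - \sum_i p_i^{\,q}}{q-1} = \frac{W(N)^{1-q} - 1}{1-q} = \frac{N - 1}{1/a} = a\,(N-1) \;\propto\; N .
$$
Each growth regime of $W(N)$ thus selects its own extensive entropy, which is the sense in which the state-space growth rate function labels a universality class.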