Entropy appears in many contexts (thermodynamics, statistical mechanics, information theory, measure-preserving dynamical systems, topological dynamics, etc.) as a measure of different properties (energy that cannot produce work, disorder, uncertainty, randomness, complexity, etc.). In this review, we focus on the so-called generalized entropies, which from a mathematical point of view are nonnegative functions defined on probability distributions that satisfy the first three Shannon–Khinchin axioms: continuity, maximality and expansibility. While these three axioms are expected to be satisfied by all macroscopic physical systems, the fourth axiom (separability or strong additivity) is in general violated by non-ergodic systems with l...
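The distinction drawn above between the first three Shannon–Khinchin axioms and the fourth (additivity) can be made concrete numerically. The following is a minimal sketch, using toy distributions chosen purely for illustration: the Boltzmann–Gibbs (Shannon) entropy is additive over independent systems, while the Tsallis entropy with q ≠ 1 satisfies only the pseudo-additive composition rule S_q(A,B) = S_q(A) + S_q(B) + (1−q)S_q(A)S_q(B).

```python
import math

def shannon(p):
    """Boltzmann-Gibbs/Shannon entropy S = -sum_i p_i ln p_i (in nats)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(x**q for x in p)) / (q - 1.0)

# toy marginal distributions (illustrative only)
pA = [0.5, 0.5]
pB = [0.25, 0.75]
# joint distribution of two statistically independent systems
joint = [a * b for a in pA for b in pB]

# Fourth SK axiom (additivity) holds for Shannon: S(A,B) = S(A) + S(B)
print(abs(shannon(joint) - (shannon(pA) + shannon(pB))) < 1e-12)  # True

# Tsallis entropy is only pseudo-additive:
# S_q(A,B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B)
q = 2.0
lhs = tsallis(joint, q)
rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
print(abs(lhs - rhs) < 1e-12)  # True
```

Both checks pass exactly (up to floating point), which is the sense in which the first three axioms are shared while the fourth singles out the Boltzmann–Gibbs form.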
We study the generalized Rényi entropies which were introduced in the physics literature. The proper...
Limit distributions are not restricted to uncorrelated variables but can be constructively derived for ...
A probability distribution encodes all the statistics of its corresponding random variable, hence it...
In information theory the four Shannon–Khinchin (SK) axioms determine the Boltzmann–Gibbs entropy, S ~ -Sig...
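The entropy singled out by the four SK axioms is the familiar S = -Σᵢ pᵢ ln pᵢ. A small illustrative check of the maximality axiom (toy distributions, for illustration only): among all distributions over n outcomes, the uniform one maximizes S, with maximum value ln n.

```python
import math

def bg_entropy(p):
    """Boltzmann-Gibbs entropy S = -sum_i p_i ln p_i (in nats)."""
    return -sum(x * math.log(x) for x in p if x > 0)

# SK maximality axiom: the uniform distribution over n outcomes
# maximizes S, and the maximum equals ln(n).
n = 4
uniform = [1.0 / n] * n
print(abs(bg_entropy(uniform) - math.log(n)) < 1e-12)  # True

# Any non-uniform distribution has strictly smaller entropy.
skewed = [0.7, 0.1, 0.1, 0.1]
print(bg_entropy(skewed) < bg_entropy(uniform))  # True
```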
Motivated by the hope that the thermodynamical framework might be extended to strongly interacting s...
The product probability property, known in the literature as statistical independence, is examined first...
Based on the Jaynes principle of maximum informational entropy, we find a generalized probabilit...
Starting from the geometrical interpretation of the Rényi entropy, we introduce further extensive ge...
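For reference, the Rényi entropy mentioned here is R_α = ln(Σᵢ pᵢ^α)/(1−α). A minimal numerical sketch (toy distributions, chosen only for illustration) of its two defining features: it recovers the Shannon entropy in the limit α → 1, and it remains additive, hence extensive, over independent systems for every α.

```python
import math

def renyi(p, alpha):
    """Rényi entropy R_alpha = ln(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    return math.log(sum(x**alpha for x in p)) / (1.0 - alpha)

def shannon(p):
    """Shannon entropy, the alpha -> 1 limit of R_alpha."""
    return -sum(x * math.log(x) for x in p if x > 0)

p = [0.2, 0.3, 0.5]
# As alpha -> 1, the Rényi entropy approaches the Shannon entropy.
print(abs(renyi(p, 1.0001) - shannon(p)) < 1e-3)  # True

# Additivity for independent systems: R_alpha(A,B) = R_alpha(A) + R_alpha(B),
# since sum_ij (p_i q_j)^alpha factorizes and the logarithm splits the product.
q = [0.4, 0.6]
joint = [a * b for a in p for b in q]
alpha = 2.0
print(abs(renyi(joint, alpha) - (renyi(p, alpha) + renyi(q, alpha))) < 1e-12)  # True
```

Additivity for all α is exactly the property that distinguishes the Rényi family from the (pseudo-additive) Tsallis family.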
In the field of non-extensive statistical mechanics it is common to focus more attention on the fami...
For statistical systems that violate one of the four Shannon–Khinchin axioms, entropy takes a more g...
We present axiomatic characterizations of the Nath, Rényi and Havrda–Charvát–Tsallis entropies und...
Modified entropies have been extensively considered by several authors in articles published almost ...
A consistent generalization of statistical mechanics is obtained by applying the MaxEnt principle t...