Limit distributions are not restricted to uncorrelated variables; they can be derived constructively for a large class of correlated random variables, as shown e.g. in the context of large deviation theory [1] and, more recently, in a very general setting by Hilhorst and Schehr [2]. At the same time, it has been conjectured on the basis of numerical evidence that several limit distributions originating from specific correlated random processes follow q-Gaussians. It could be shown that this is not the case in some of these situations, where more complicated limit distributions are necessary. In this work we derive the analytical form of the entropy which, under the maximum entropy principle with ordinary constraints imposed, provides exactly the...
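The maximum entropy principle with ordinary (moment) constraints, as invoked above, can be illustrated with a short numerical sketch: maximizing Shannon entropy over a finite support subject to a fixed mean yields the Lagrange-dual form p_i ∝ exp(−λx_i), with λ fixed by the constraint. This is a generic illustration of the mechanism, not the entropy functional derived in the cited work; the helper name `maxent_discrete` is ours.

```python
import math

def maxent_discrete(values, target_mean, tol=1e-10):
    """Maximum-Shannon-entropy distribution on `values` with a fixed mean.

    Lagrange duality gives p_i proportional to exp(-lam * x_i); the
    multiplier lam is found by bisection on the mean constraint.
    """
    def mean_for(lam):
        # Mean of the tilted distribution for a given multiplier.
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    # mean_for is strictly decreasing in lam, so bisection applies.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]
```

On equally spaced support the result is geometric (the discrete analogue of the exponential family that ordinary constraints single out); richer limit laws, such as the q-Gaussians discussed above, require a different entropy functional under the same constraint structure.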
As shown by Jaynes, the canonical and grand canonical probability distributions of equilibrium stati...
Based on the Jaynes principle of maximum for informational entropy, we find a generalized probabilit...
Entropy appears in many contexts (thermodynamics, statistical mechanics, information theory, measure...
We consider the maximum entropy problems associated with Rényi $Q$-entropy, su...
We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two ‘s...
Distributions derived from the maximization of Rényi...
A distribution that maximizes an entropy can be found by applying two different principles. On the o...