The product probability property, known in the literature as statistical independence, is examined first. Then various generalized entropies are introduced, all of which generalize Shannon entropy. It is shown that the nature of the recursivity postulate automatically determines the logarithmic functional form of Shannon entropy. Owing to this logarithmic form, Shannon entropy naturally gives rise to additivity when applied to situations having the product probability property. It is argued that the natural process is non-additivity, leading to non-extensive statistical mechanics even in product-probability situations, and that additivity can hold only because a recursivity postulate enforces the logarithmic functional form.
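A minimal numerical sketch of the additivity point (plain Python; the two marginal distributions are arbitrary illustrative values): for independent subsystems, the Shannon entropy of the product distribution equals the sum of the marginals' entropies, while Tsallis entropy, used here as a stand-in for a non-logarithmic generalized entropy, obeys a pseudo-additive composition rule instead.

    from math import log

    def shannon(p):
        # S[p] = -sum_i p_i ln p_i (natural log; zero-probability terms skipped)
        return -sum(x * log(x) for x in p if x > 0)

    def tsallis(p, q):
        # S_q[p] = (1 - sum_i p_i^q) / (q - 1); recovers Shannon as q -> 1
        return (1 - sum(x ** q for x in p)) / (q - 1)

    pA = [0.5, 0.3, 0.2]                   # hypothetical marginal of subsystem A
    pB = [0.6, 0.4]                        # hypothetical marginal of subsystem B
    pAB = [a * b for a in pA for b in pB]  # product (independent) joint distribution

    # Shannon: strictly additive on product distributions
    print(shannon(pAB), shannon(pA) + shannon(pB))  # equal up to rounding

    # Tsallis (q = 2): S_q(AB) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)
    q = 2.0
    lhs = tsallis(pAB, q)
    rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
    print(lhs, rhs)                        # equal up to rounding

The cross term (1 - q) S_q(A) S_q(B) vanishes only at q = 1, which is exactly where the logarithmic (Shannon) form is recovered.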
The extremization of an appropriate entropic functional may yield the probability...
A consistent generalization of statistical mechanics is obtained by applying the MaxEnt principle t...
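As a concrete instance of that extremization (a sketch, not the quoted paper's construction: the three-level spectrum, the inverse temperature beta, and the use of scipy's SLSQP solver are all illustrative choices), maximizing Shannon entropy under normalization and a fixed mean energy reproduces the Boltzmann-Gibbs weights p_i proportional to exp(-beta * E_i):

    import numpy as np
    from scipy.optimize import minimize

    E = np.array([0.0, 1.0, 2.0])   # hypothetical energy levels
    beta = 1.3                      # hypothetical inverse temperature

    # Closed-form MaxEnt solution: Boltzmann-Gibbs weights exp(-beta*E)/Z
    w = np.exp(-beta * E)
    p_bg = w / w.sum()
    U = p_bg @ E                    # the mean energy this beta corresponds to

    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)    # guard against log(0)
        return np.sum(p * np.log(p))  # minimizing -S maximizes S

    cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0},
            {"type": "eq", "fun": lambda p: p @ E - U}]
    res = minimize(neg_entropy, x0=np.full(3, 1.0 / 3.0), bounds=[(0.0, 1.0)] * 3,
                   constraints=cons, method="SLSQP")

    print(res.x)   # numerical maximizer under the two constraints
    print(p_bg)    # analytic Boltzmann-Gibbs weights -- the two agree closely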
We study entropy rates of random sequences for general entropy functionals in...
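A small self-contained sketch of what an entropy-rate computation looks like in the simplest non-trivial case (a two-state Markov chain with made-up transition probabilities; the Shannon functional is used here, though the quoted work treats more general functionals): the analytic rate h = sum_i pi_i sum_j P_ij ln(1/P_ij) is cross-checked against the block-entropy limit H_n / n.

    import numpy as np

    # Hypothetical two-state Markov chain (rows: current state, columns: next state)
    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])

    # Stationary distribution: left eigenvector of P for eigenvalue 1
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()

    # Analytic Shannon entropy rate (nats per symbol)
    h = -np.sum(pi[:, None] * P * np.log(P))
    print("rate:", h)

    def block_entropy(n):
        # Exact Shannon entropy of all length-n paths, started from stationarity
        w = pi.copy()                 # probability of each path so far
        s = np.array([0, 1])          # last state of each path
        for _ in range(n - 1):
            w = np.concatenate([w * P[s, 0], w * P[s, 1]])
            s = np.concatenate([np.zeros_like(s), np.ones_like(s)])
        return -np.sum(w * np.log(w))

    for n in (2, 8, 14):
        print(n, block_entropy(n) / n)   # approaches the rate h from above as n grows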
Entropy appears in many contexts (thermodynamics, statistical mechanics, information theory, measure...
In information theory the four Shannon-Khinchin (SK) axioms determine Boltzmann-Gibbs entropy, S ~ -Σ_i p_i log p_i, ...
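For reference, the four axioms in question, stated informally (the phrasing follows the standard Khinchin formulation and is not necessarily the quoted paper's exact wording):

1. Continuity: S(p_1, ..., p_W) depends continuously on all p_i.
2. Maximality: S is largest at the uniform distribution p_i = 1/W.
3. Expansibility: adding a state of probability zero changes nothing, S(p_1, ..., p_W, 0) = S(p_1, ..., p_W).
4. Separability: for a composite system, S(A, B) = S(A) + S(B|A), with S(B|A) the conditional entropy; for independent A and B this reduces to additivity, S(A, B) = S(A) + S(B).

Under all four, S = -k Σ_i p_i log p_i is the unique form, up to the multiplicative constant k.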
Motivated by the hope that the thermodynamical framework might be extended to strongly interacting s...
For statistical systems that violate one of the four Shannon–Khinchin axioms, entropy takes a more general form...
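One widely used instance of such a more general form is Tsallis' one-parameter family (chosen here purely as an illustration; it relaxes the fourth, separability axiom). A short check (plain Python, repeating the S_q definition from the sketch above so this block stands alone) that the Shannon value is recovered as q -> 1:

    from math import log

    def tsallis(p, q):
        # S_q[p] = (1 - sum_i p_i^q) / (q - 1)
        return (1 - sum(x ** q for x in p)) / (q - 1)

    p = [0.5, 0.3, 0.2]                    # hypothetical distribution
    shannon = -sum(x * log(x) for x in p)  # limiting value, about 1.0297 nats
    for q in (2.0, 1.5, 1.1, 1.001):
        print(q, tsallis(p, q))            # converges to the Shannon value as q -> 1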
Here we assume a discrete random variable, possessing a one-to-one correspondence with the set of natural numbers...
Based on the Jaynes principle of maximum informational entropy, we find a generalized probability...
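A sketch of the shape such a generalized maximizer takes in the Tsallis setting (the q-exponential family; the energy grid, beta, and q values are illustrative, and equating this with the quoted paper's exact result is an assumption):

    import numpy as np

    def q_exp(x, q):
        # e_q(x) = [1 + (1 - q) x]_+ ** (1 / (1 - q)); ordinary exp(x) as q -> 1
        if abs(q - 1.0) < 1e-12:
            return np.exp(x)
        base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
        return base ** (1.0 / (1.0 - q))

    E = np.linspace(0.0, 5.0, 6)    # hypothetical energy levels
    beta = 1.0                      # hypothetical Lagrange multiplier

    for q in (1.0, 1.2, 1.5):
        w = q_exp(-beta * E, q)
        p = w / w.sum()
        print(q, np.round(p, 4))    # tails fatten into power laws as q rises above 1

At q = 1 this reduces to the ordinary Boltzmann-Gibbs exponential; for q > 1 the distribution decays as a power law, which is the hallmark of these generalized maximizers.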