The Maxwell-Boltzmann distribution is compatible with Shannon's entropy, which in turn is equivalent to thermodynamic entropy. In such a case, -ln(P(e)) is information which matches the information e (energy) brought by a particle into a collision. In (1) we considered "altered" probabilities Palt(e) = P(e)/[1 ∓ P(e)] to describe information associated with a collision of fermions or bosons. In this note we try to reinterpret Palt(e) for fermions or bosons as two pieces of information which combine to create the information e brought into a collision. In principle there could be more terms. It is noted that these terms are linear in P(e). They lead to a sum of Shannon-like entropies, i.e. P(e) ln[P(e)] + [1-P(e)] ln[1-P(e)]. We als...
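To make those quantities concrete, here is a minimal Python sketch (our illustration, not code from (1)): it takes P(e) = exp(-e/T) as the Maxwell-Boltzmann weight, forms the altered probabilities P(e)/[1 ∓ P(e)], and evaluates the two-term Shannon-like sum. The assignment of the minus sign to bosons and the plus sign to fermions is our assumption, chosen so the altered forms reproduce the Bose-Einstein and Fermi-Dirac occupancies.

    import math

    def P(e, T=1.0):
        # Maxwell-Boltzmann weight exp(-e/T); normalization omitted for illustration
        return math.exp(-e / T)

    def P_alt(e, T=1.0, boson=True):
        # "Altered" probability P/(1 -/+ P); with P = exp(-e/T) this equals
        # 1/(exp(e/T) - 1) (Bose-Einstein) or 1/(exp(e/T) + 1) (Fermi-Dirac)
        p = P(e, T)
        return p / (1.0 - p) if boson else p / (1.0 + p)

    def shannon_like_sum(p):
        # the two-term, binary-entropy-like sum quoted in the abstract
        return p * math.log(p) + (1.0 - p) * math.log(1.0 - p)

    for e in (0.5, 1.0, 2.0):
        p = P(e)
        print(e, p, P_alt(e, boson=True), P_alt(e, boson=False), shannon_like_sum(p))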
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the Thermodynam...
In the statistics of the Maxwell-Boltzmann distribution, one makes use of the idea of elastic collis...
In classical statistical mechanics there exists the idea of maximum entropy subject to constraints w...
Entropy in the Maxwell-Boltzmann example of a gas with no potential may be mapped into a set of tria...
Maxwell-Boltzmann (MB) distributions and even expressions of Shannon’s entropy emerged in the 1800s ...
Statistical mechanics often focuses on entropy which is related to maximizing the number of possible...
Addition: Reference (1) is: Ran, G. and Du, J., Are power law distributions an equilibrium distrib...
Information theory, which applies to compressing messages, suggests that -ln(probability) is informat...
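As a quick illustration of that identification (standard information theory, with hypothetical symbol probabilities): the information, or surprisal, of an outcome with probability p is -ln(p), and its average over outcomes is exactly Shannon's entropy.

    import math

    def surprisal(p):
        # information content, in nats, of an outcome with probability p
        return -math.log(p)

    probs = [0.5, 0.25, 0.25]                  # hypothetical message-symbol probabilities
    H = sum(p * surprisal(p) for p in probs)   # average surprisal = Shannon entropy
    print(H)                                   # about 1.04 nats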
In a previous note (1) we considered dE(thermal) = Sum over i ei dP(i), where P(i) is the probabilit...
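For context, a standard decomposition in the same notation: since E = Sum over i ei P(i), differentiation gives dE = Sum over i ei dP(i) + Sum over i P(i) dei. The first term, the dE(thermal) above, changes the occupation probabilities at fixed levels (heat-like), while the second changes the levels themselves at fixed probabilities (work-like).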
One may obtain equilibrium particle number distributions in statistical mechanics by applying time r...
One may maximize Shannon’s entropy -Sum over i f(ei) ln(f(ei)) subject to the constraint Sum over i ...
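A minimal sketch of that maximization, assuming the constraint is a fixed mean energy (the levels and temperature below are hypothetical): the Lagrange-multiplier solution is f(ei) proportional to exp(-ei/T), which the code simply normalizes.

    import math

    def max_entropy_distribution(levels, T=1.0):
        # Maximizing -Sum f(ei) ln f(ei) subject to Sum f(ei) = 1 and a fixed
        # mean energy yields f(ei) = exp(-ei/T)/Z; T encodes the Lagrange multiplier.
        weights = [math.exp(-e / T) for e in levels]
        Z = sum(weights)
        return [w / Z for w in weights]

    levels = [0.0, 1.0, 2.0, 3.0]       # hypothetical energy levels
    f = max_entropy_distribution(levels)
    print(f, sum(f))                    # probabilities summing to 1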
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the Thermodynam...
In (1) it is suggested that Shannon’s entropy - Sum over i P(i) ln(P(i)) be thought of as the averag...
We suggest that the condition p(ei)p(ej) = p(ei+ej) (1) is the underlying idea of the Maxwell-Bolt...
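Taking logs, condition (1) reads ln p(ei+ej) = ln p(ei) + ln p(ej), whose continuous solutions are linear in e, so p(e) = exp(-e/T); the condition also forces the prefactor to be 1, i.e. p(0) = 1. A small numerical check of this reading:

    import math

    def p(e, T=1.0):
        # exponential ansatz; the prefactor must be 1 (p(0) = 1) for
        # p(ei)p(ej) = p(ei+ej) to hold exactly
        return math.exp(-e / T)

    ei, ej = 0.7, 1.3
    print(p(ei) * p(ej), p(ei + ej))    # equal up to floating-point rounding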
In the literature, it is suggested that one can maximize Shannon's entropy -Sum over i Pi ln(Pi) subj...