We apply Shannon's information entropy to the distribution of branching fractions in a particle decay. This quantifies how important a newly reported decay channel is, in terms of the information it adds to the channels already known. Because the entropy is additive, one can subdivide the set of channels and discuss, for example, how much information the discovery of a new decay branch would add, or subdivide the decay distribution down to the level of individual quantum states (whose number can be quickly estimated from the phase space). We illustrate the concept with several examples of experimentally known particle decay distributions.
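As a rough illustration of the bookkeeping described above, the sketch below computes the Shannon entropy of a set of branching fractions and the change in entropy when a new channel is reported. The channel names and fractions here are invented for illustration and are not taken from any experimental decay table; this is a minimal sketch of the entropy calculation, not the paper's own analysis.

```python
import math

def shannon_entropy(branching_fractions):
    """Shannon entropy S = -sum_i B_i ln B_i of a set of branching fractions (in nats)."""
    return -sum(b * math.log(b) for b in branching_fractions if b > 0.0)

# Hypothetical three-channel decay; the fractions sum to 1.
channels = {"A": 0.60, "B": 0.30, "C": 0.10}
S_old = shannon_entropy(channels.values())

# Suppose a new channel "D" is reported with a 5% branching fraction.
# The previously known channels are rescaled so the total stays normalised.
new_fraction = 0.05
rescaled = {k: b * (1.0 - new_fraction) for k, b in channels.items()}
rescaled["D"] = new_fraction
S_new = shannon_entropy(rescaled.values())

print(f"Entropy before: {S_old:.4f} nats")
print(f"Entropy after : {S_new:.4f} nats")
print(f"Information added by the new channel: {S_new - S_old:.4f} nats")
```

Because the entropy is additive over groupings, the same function can be applied to any subdivision of the channels, e.g. grouping them by final-state particle content or resolving them down to individual quantum states.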