A characterization of the entropy −∫ f log f dx of a random variable is provided. If X is a random variable and Y = T(X), where T is a piecewise monotonic function, the entropy difference H[X] − H[Y] is first characterized as the expectation of a function with reasonable properties. After entropy differences are characterized, zero entropy is assigned to the random variable that is uniformly distributed on the unit interval. In the course of the derivation, a characterization of Kullback's directed divergence is also obtained.
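As a point of reference, the classical change-of-variables identity for differential entropy gives one concrete instance of such an expectation (a minimal sketch, assuming T is strictly monotonic and differentiable; the paper's characterization for piecewise monotonic T is more general and need not take exactly this form):

\[
H[Y] = H[X] + \mathbb{E}\bigl[\log\lvert T'(X)\rvert\bigr],
\qquad\text{hence}\qquad
H[X] - H[Y] = \mathbb{E}\bigl[-\log\lvert T'(X)\rvert\bigr].
\]

For instance, if X is uniform on the unit interval (so H[X] = 0) and T(x) = 2x, then H[Y] = log 2, consistent with assigning zero entropy to the uniform distribution on (0, 1).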