A characterization of the entropy −∫ f log f dx of a random variable is provided. If X is a random variable and Y = T(X), where T is a piecewise monotonic function, the entropy difference H[X] − H[Y] is first characterized as the expectation of a function with reasonable properties. Once entropy differences are characterized, zero entropy is assigned to the random variable uniformly distributed on the unit interval. In the course of the derivation, a characterization of Kullback's directed divergence is also obtained.
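For orientation, a standard identity consistent with this setup (stated here for a strictly monotone, differentiable T and a density f for X; this is the textbook change-of-variables relation, not the paper's axiomatic characterization):

$$ H[X] - H[Y] \;=\; -\,E\bigl[\log \lvert T'(X)\rvert\bigr], \qquad H[X] \;=\; -\int f(x)\,\log f(x)\,dx, $$

so the entropy difference is the expectation of a function of X determined by T alone. The normalization mentioned above then follows from the uniform density on the unit interval, $H[U] = -\int_0^1 1\cdot\log 1\,dx = 0$, and Kullback's directed divergence referred to in the abstract is $D(f\Vert g) = \int f(x)\,\log\bigl(f(x)/g(x)\bigr)\,dx$.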
The relationship between three probability distributions and their maximizable entropy forms is disc...
Rényi divergence is related to Rényi ...
This article provides a completion to theories of information based on entropy, resolving a longstan...
The present study on the characterization of probability distributions using the residual entropy fu...
Interest in the informational content of truncation motivates the study of the residual entropy func...
Shannon entropy of a probability distribution gives a weighted mean of a measure of information that...
We consider the maximum entropy problems associated with Rényi $Q$-entropy, su...
It is well known that the entropy H(X) of a discrete random variable X is always greater than or equ...
It is well known that the entropy H(X) of a discrete random variable X is always greater than or equ...
As entropy is also an important quantity in physics, we relate our results to physical processes by ...
The refinement axiom for entropy has been provocative in providing foundations of information theory...
We give a new characterization of relative entropy, also known as the Kullback-Leibler divergence. W...
Shannon's entropy was characterized by many authors by assuming different sets of postulates. One ot...
Entropy, conditional entropy and mutual information for discrete-valued random variables play impor...