Shannon's entropy has been characterized by many authors under different sets of postulates. Another measure closely associated with Shannon's entropy is directed divergence, also known as information gain. In this paper, a characterization theorem for directed divergence is given by assuming intuitively reasonable postulates and with the help of functional equations.
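For context, the abstract does not restate the measure itself. In the standard Kullback–Leibler formulation, which is presumably the one characterized here, the directed divergence of a discrete probability distribution P = (p_1, ..., p_n) from Q = (q_1, ..., q_n) is

\[
D(P \,\|\, Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i},
\]

which is nonnegative and vanishes exactly when P = Q.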