Shannon's entropy has been characterized by many authors under different sets of postulates. Another measure associated with Shannon's entropy is the directed divergence, or information gain. In this paper, a characterization theorem for the directed divergence is given by assuming intuitively reasonable postulates and with the help of functional equations.
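For orientation (the standard definition, not quoted from the abstract above): for discrete distributions $P=(p_1,\dots,p_n)$ and $Q=(q_1,\dots,q_n)$, Kullback's directed divergence is $D(P\|Q)=\sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}$, the expected log-likelihood ratio in favor of $P$ over $Q$.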
We consider the maximum entropy problems associated with Rényi $Q$-entropy, su...
This paper concerns an axiomatic characterization of information measures of dimension k. Th...
This book presents new and original research in Statistical Information Theory, based on minimum div...
Several authors have developed characterization theorems for the directed divergence or information ...
The representation for measures of information which are symmetric, expansible, and have the branchi...
The directed divergence of type β which generalizes Kullback's directed divergence or Information me...
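Up to a normalizing constant that varies by author, the type $\beta$ directed divergence is commonly written $D_\beta(P\|Q) = (2^{\beta-1}-1)^{-1}\bigl[\sum_i p_i^{\beta} q_i^{1-\beta} - 1\bigr]$ for $\beta \neq 1$; letting $\beta \to 1$ recovers Kullback's directed divergence (in bits).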
Data science, information theory, probability theory, statistical learning and other related discipl...
Shannon entropy of a probability distribution gives a weighted mean of a measure of information that...
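The weighted-mean reading is the standard one: $H(P) = -\sum_i p_i \log p_i = \sum_i p_i \log \frac{1}{p_i}$, the $p_i$-weighted mean of the self-information $\log(1/p_i)$ of each outcome.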
The idea of using functionals of Information Theory, such as entropies or divergences, in statistica...
Rényi divergence is related to Rényi ...
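For reference, the Rényi divergence of order $\alpha$ ($\alpha \neq 0,1$) is $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \sum_i p_i^{\alpha} q_i^{1-\alpha}$; it tends to the Kullback-Leibler divergence as $\alpha \to 1$.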
In a recent paper, we discussed normalized measures of entropy, i.e. measures of entropy all of wh...
It is well known that in Information Theory and Machine Learning the Kullback-Leibler divergence, wh...
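As an illustrative sketch only (not code from any of the works excerpted here), the Kullback-Leibler divergence of two discrete distributions can be computed as follows:

import numpy as np

def kl_divergence(p, q):
    # D(P || Q) = sum_i p_i * log(p_i / q_i), in nats.
    # Assumes q_i > 0 wherever p_i > 0; terms with p_i == 0
    # contribute 0 by the usual convention 0 * log 0 = 0.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

print(kl_divergence([0.7, 0.3], [0.5, 0.5]))  # biased vs. fair coin, approx. 0.0823 nats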
A characterization of the entropy $-\int f \log f \, dx$ of a random variable is provided. If X is a random v...
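As a worked instance of this functional: if $X$ is Gaussian with density $f$ and variance $\sigma^2$, then $-\int f \log f \, dx = \tfrac{1}{2}\log(2\pi e \sigma^2)$.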
Information and Divergence Measures deals with the study of problems concerning information processi...