The classical Csiszar-Kullback inequality bounds the L^{1}-distance of two probability densities in terms of their relative (convex) entropies. Here we generalize such inequalities to not necessarily normalized and possibly non-positive L^{1} functions. Also, our generalized Csiszar-Kullback inequalities are in many important cases sharper than the classical ones (in terms of the functional dependence of the L^{1} bound on the relative entropy). Moreover, our construction of these bounds is rather elementary.
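In its usual classical form (a standard statement, with natural logarithms and with f and g probability densities; this notation is not fixed in the abstract above), the inequality referred to here bounds the squared L^{1}-distance by twice the relative entropy:
\[ \| f - g \|_{L^1}^{2} \;\le\; 2 \int f \log\frac{f}{g}\, dx. \]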
We give a new characterization of relative entropy, also known as the Kullback-Leibler divergence. W...
Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recu...
To procure inequalities for divergences between probability distributions, Jensen's inequality is th...
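Jensen's inequality, in the form commonly invoked for such divergence bounds (a standard statement; \varphi convex and X an integrable random variable are assumptions, not notation from the truncated abstract), reads
\[ \varphi\!\left(\mathbb{E}[X]\right) \;\le\; \mathbb{E}\!\left[\varphi(X)\right]. \]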
Inequalities for the Kullback-Leibler and χ²-distances and applications for Shannon...
Entropy, conditional entropy and mutual information for discrete-valued random variables play impor...
In this paper, we derive some upper bounds for the relative entropy D(p ‖ q) of two probabil...
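Here D(p ‖ q) denotes the relative entropy of two probability distributions p and q, which in the discrete case is (the base of the logarithm is a convention the truncated abstract does not fix)
\[ D(p \,\|\, q) \;=\; \sum_{i} p_i \log\frac{p_i}{q_i}. \]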
In the first part of this paper, we show the entropy measures defined from the distance, in the sens...
This paper extends some geometric properties of a one-parameter family of relative entropies. These ...
In this paper we derive some upper bounds for the relative entropy D(p || q) of two probability dist...
Inequalities for the Kullback-Leibler and χ²-distances and applications for Shannon’s entropy and mu...
New estimates of the Kullback-Leibler distance and applications for Shannon’s entropy and mutual inf...
Using the concavity property of the log mapping and the weighted arithmetic mean - geometric mean in...
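The weighted arithmetic mean - geometric mean inequality referred to here is, in its standard form (with weights w_i ≥ 0 summing to 1 and positive x_i; notation assumed, not taken from the truncated abstract),
\[ \prod_{i=1}^{n} x_i^{w_i} \;\le\; \sum_{i=1}^{n} w_i x_i. \]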
We show how to determine the maximum and minimum possible values of one measure of entropy for a giv...
Generalized entropic functionals are an active area of research. Hence lower and upper bo...
Pinsker's inequality states that the relative entropy between two random variables X and Y dominates...
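In its standard form (with natural logarithms and the L^{1}-distance \|P - Q\|_1 = \sum_i |p_i - q_i|; the exact normalization used in the truncated abstract is not visible), Pinsker's inequality reads
\[ D(P \,\|\, Q) \;\ge\; \tfrac{1}{2}\, \|P - Q\|_1^{2}. \]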