Inequalities for the Kullback-Leibler and χ²-distances and applications for Shannon's entropy and mutual information are given.
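For reference, since the abstract does not restate them, the two distances in question are standardly defined for discrete probability distributions p = (p_1, ..., p_n) and q = (q_1, ..., q_n) (with the convention 0 log 0 = 0) by
\[
D_{KL}(p\|q) = \sum_{i=1}^{n} p_i \log\frac{p_i}{q_i},
\qquad
D_{\chi^2}(p,q) = \sum_{i=1}^{n} \frac{(p_i - q_i)^2}{q_i}.
\]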
In this paper we discuss new inequalities for the logarithmic mapping and apply them in Information Theo...
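The abstract does not say which logarithmic inequalities are meant; a classical example of the kind of elementary bound used in this area is
\[
1 - \frac{1}{x} \le \log x \le x - 1, \qquad x > 0,
\]
which, applied term by term to \(\log(p_i/q_i)\), already yields \(0 \le D_{KL}(p\|q) \le D_{\chi^2}(p,q)\).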
We establish new lower and upper bounds for Jensen's discrete inequality. Applying those res...
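Jensen's discrete inequality, which these bounds refine, states that for a convex function f on an interval I, points x_1, ..., x_n in I and non-negative weights w_1, ..., w_n summing to 1,
\[
f\!\left(\sum_{i=1}^{n} w_i x_i\right) \le \sum_{i=1}^{n} w_i f(x_i),
\]
with the inequality reversed for concave f (such as the logarithm), which is how it enters the entropy and divergence estimates listed here.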
This report contains a list of some of the more prominent properties and theorems concerning t...
New estimates of the Kullback-Leibler distance and applications for Shannon’s entropy and mutual inf...
Using the concavity property of the log mapping and the weighted arithmetic mean - geometric mean in...
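The weighted arithmetic mean - geometric mean inequality invoked here states that for x_i > 0 and weights w_i ≥ 0 with \(\sum_{i=1}^{n} w_i = 1\),
\[
\prod_{i=1}^{n} x_i^{w_i} \le \sum_{i=1}^{n} w_i x_i,
\]
which is precisely Jensen's inequality applied to the concave mapping x ↦ log x.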
Entropy, conditional entropy and mutual information for discrete-valued random variables play impor...
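For discrete random variables X and Y with joint distribution p(x, y), the quantities in question are
\[
H(X) = -\sum_{x} p(x)\log p(x),
\qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\log p(x \mid y),
\qquad
I(X;Y) = H(X) - H(X \mid Y).
\]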
The classical Csiszár-Kullback inequality bounds the L^{1}-distance of two probability densities in ...
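In its most common form (with natural logarithms), the Csiszár-Kullback inequality, also known as Pinsker's inequality, reads
\[
\|p - q\|_{1}^{2} \le 2\, D_{KL}(p\|q),
\]
so convergence in the Kullback-Leibler sense forces convergence in L^{1}.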
In this paper we obtain an upper bound for the Kullback-Leibler distance different from the bound obtaine...
The Kullback-Leibler information number, I(P||Q), determined for two probability measures defined on...
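For two probability measures P and Q on a common measurable space, this information number is given by
\[
I(P\|Q) = \int \log\!\left(\frac{dP}{dQ}\right) dP
\]
when P is absolutely continuous with respect to Q, and \(I(P\|Q) = +\infty\) otherwise; on a countable space it reduces to the discrete Kullback-Leibler distance above.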
By the use of a counterpart inequality for Jensen's discrete inequality established in [1] for ...
We introduce two new information theoretic measures of distances among probability distributions and...
In this note, using some refinements of Jensen's discrete inequality, we give some new refinement...