Inequalities for the Kullback-Leibler and χ²-distances and applications for Shannon's entropy and mutual information are given.
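For context (standard definitions, not quoted from the abstract above): for discrete probability distributions p = (p_1, ..., p_n) and q = (q_1, ..., q_n), the two distances in question are

    D(p \| q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}, \qquad \chi^2(p, q) = \sum_{i=1}^{n} \frac{(p_i - q_i)^2}{q_i}.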
We show that an information-theoretic property of Shannon's entropy power, known as concavity of ent...
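The property alluded to is, in its standard form, Costa's concavity of entropy power: writing h(X) for the differential entropy of a real-valued X and

    N(X) = \frac{1}{2\pi e} \, e^{2 h(X)}

for its entropy power, the map t \mapsto N(X + \sqrt{t}\, Z) is concave in t \ge 0, where Z is a standard Gaussian independent of X.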
Upper and lower bounds are obtained for the joint entropy of a collection of random variabl...
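Standard bounds of this type, added for context (the paper's own bounds may be sharper): for discrete X_1, ..., X_n,

    \max_{1 \le i \le n} H(X_i) \le H(X_1, \ldots, X_n) \le \sum_{i=1}^{n} H(X_i),

with equality on the right if and only if the variables are independent.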
The Kullback-Leibler information number, I(P||Q), determined for two probability measures defined on...
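In the measure-theoretic setting this quantity is standardly defined by

    I(P \| Q) = \int \log \frac{dP}{dQ} \, dP

when P is absolutely continuous with respect to Q, and I(P \| Q) = +\infty otherwise.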
New estimates of the Kullback-Leibler distance and applications for Shannon’s entropy and mutual inf...
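As a minimal numerical sketch (not taken from any of the papers above), the following Python snippet computes D(p||q) and χ²(p,q) for a pair of discrete distributions and checks the classical chain of estimates D(p||q) <= log(1 + χ²(p,q)) <= χ²(p,q):

    import numpy as np

    def kl_divergence(p, q):
        # Discrete Kullback-Leibler distance D(p||q), in nats.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0  # terms with p_i = 0 contribute 0 by convention
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def chi2_distance(p, q):
        # Discrete chi-squared distance: sum_i (p_i - q_i)^2 / q_i.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum((p - q) ** 2 / q))

    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    d, c = kl_divergence(p, q), chi2_distance(p, q)
    # Classical chain of estimates: D(p||q) <= log(1 + chi^2) <= chi^2.
    assert d <= np.log1p(c) <= c
    print(d, np.log1p(c), c)  # approx. 0.0253 <= 0.0488 <= 0.0500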
Entropy, conditional entropy and mutual information for discrete-valued random variables play impor...
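The standard identities relating these three quantities (added for context):

    I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y).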
In this note, using some refinements of Jensen’s discrete inequality, we give some new refinement...
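Jensen's discrete inequality, which these refinements sharpen, states that for a convex function f and weights p_i \ge 0 with \sum_i p_i = 1,

    f\!\left( \sum_i p_i x_i \right) \le \sum_i p_i f(x_i),

with the inequality reversed when f is concave.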
Using the concavity property of the log mapping and the weighted arithmetic mean - geometric mean in...
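The weighted arithmetic mean - geometric mean inequality in question: for x_i > 0 and weights w_i \ge 0 with \sum_i w_i = 1,

    \prod_i x_i^{w_i} \le \sum_i w_i x_i,

which follows from the concavity of log by applying Jensen's inequality to \log x_i.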
By the use of a counterpart inequality for Jensen's discrete inequality established in [1] for ...
In this paper, we derive some upper bounds for the relative entropy D(p ‖ q) of two probabil...
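One representative bound of this kind (a standard consequence of Jensen's inequality, not necessarily the one derived in the paper): since log is concave,

    D(p \| q) = \sum_i p_i \log \frac{p_i}{q_i} \le \log \sum_i \frac{p_i^2}{q_i} = \log\!\left( 1 + \chi^2(p, q) \right) \le \chi^2(p, q).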
We establish new lower and upper bounds for Jensen’s discrete inequality. Applying those res...
The refinement axiom for entropy has been provocative in providing foundations of information theory...
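The axiom in question is usually formalized as Shannon's grouping (refinement) property: for p_1 + \cdots + p_n = 1 with p_1 + p_2 > 0,

    H(p_1, p_2, p_3, \ldots, p_n) = H(p_1 + p_2, p_3, \ldots, p_n) + (p_1 + p_2) \, H\!\left( \frac{p_1}{p_1 + p_2}, \frac{p_2}{p_1 + p_2} \right).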