Abstract. Entropy, conditional entropy, and mutual information for discrete-valued random variables play important roles in information theory (see Ash (1965) [8] and Cover and Thomas (2006) [9]). Our purpose in this work is to present a strong upper bound for the classical Shannon entropy, refining recent results from the literature. To this end, we consider the work of Simic (2009) [4], in which new entropy bounds based on a refinement of Jensen's inequality are presented. Our work improves Simic's basic result through a stronger refinement of Jensen's inequality, which we then apply to information theory.
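For context, the classical upper bound that such refinements sharpen is the standard textbook consequence of Jensen's inequality (this is not the paper's new result, only the baseline it improves upon). For a discrete random variable X with probability masses p_1, ..., p_m, applying Jensen's inequality to the concave logarithm gives

\[
H(X) \;=\; -\sum_{i=1}^{m} p_i \log p_i \;=\; \sum_{i=1}^{m} p_i \log \frac{1}{p_i} \;\le\; \log\!\Big(\sum_{i=1}^{m} p_i \cdot \frac{1}{p_i}\Big) \;=\; \log m,
\]

with equality if and only if p_i = 1/m for every i. Bounds of the kind discussed above aim to tighten this inequality beyond the plain Jensen estimate.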