In information theory, the fundamental tool is the entropy function, whose upper bound is derived using Jensen's inequality. In this paper, we extend Jensen's inequality and apply it to derive some useful lower bounds for various entropy measures of discrete random variables.
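The upper bound mentioned in this abstract can be checked numerically: applying Jensen's inequality to the concave logarithm gives H(X) ≤ log₂ m for an alphabet of size m, with equality iff X is uniform. A minimal sketch (the pmf below is an arbitrary example, not taken from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i log2 p_i of a pmf (0 log 0 := 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Arbitrary example pmf over an alphabet of size m = 4.
p = [0.5, 0.25, 0.125, 0.125]
m = len(p)

print(entropy(p))      # 1.75
print(math.log2(m))    # 2.0

# Jensen bound: H(X) <= log2(m), strict here since p is non-uniform.
assert entropy(p) <= math.log2(m)
```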
Continuous probability density functions and discrete probability mass functions are tabulated which...
Using Jensen’s inequality and the converse Jensen’s inequality for superquadratic functions we obtai...
This paper is part of a general study of efficient information selection, storage and processing. It...
We establish new lower and upper bounds for Jensen’s discrete inequality. Applying those res...
Entropy, conditional entropy and mutual information for discrete-valued random variables pla...
The Jensen inequality is one of the most important inequalities in the theory of inequalities, and numer...
It is well known that the entropy H(X) of a discrete random variable X is always greater than or equ...
By the use of a counterpart inequality for Jensen's discrete inequality established in [1] for ...
Using an inequality for convex functions by Andrica and Raşa [1] (2.1), we point out a new i...
The entropy H(X) of a discrete random variable X of alphabet size m is always non-negative and upper...
Jensen’s inequality is one of the fundamental inequalities which has several applications in almost ...
In 1994, Jim Massey proposed the guessing entropy as a measure of the difficul...
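Massey's guessing entropy, mentioned in this abstract, is the expected number of guesses needed to identify the value of X when guesses are made in decreasing order of probability (the optimal strategy). A minimal sketch (the pmf is an arbitrary example, not from the paper):

```python
def guessing_entropy(p):
    """Expected number of guesses E[G] = sum_i i * p_(i), where the
    probabilities p_(1) >= p_(2) >= ... are sorted in decreasing order,
    i.e. the guesser tries the most likely values first."""
    q = sorted(p, reverse=True)
    return sum(i * pi for i, pi in enumerate(q, start=1))

# Arbitrary example pmf: the optimal guesser tries 0.5 first, then 0.25, ...
p = [0.125, 0.5, 0.25, 0.125]
print(guessing_entropy(p))  # 1*0.5 + 2*0.25 + 3*0.125 + 4*0.125 = 1.875
```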
This report presents the result recently published in [1] that establishes a one-to-one corresponden...