This report presents a result recently published in [1] that establishes a one-to-one correspondence between information inequalities and group inequalities. Our aim in this report is to present this result as concisely as possible without omitting any steps in the derivation, making it suitable for brief perusal. In Sections 2–4, we introduce the notions of entropy functions, group-characterizable functions, information inequalities, and group inequalities. We confine most of the technical details to Section 5, a section that may be skipped if one is interested only in the general idea of how the result is established. We complete the demonstration in Section 6 and conclude in Section 7.

2 Entropy functions
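Before the formal development, the construction at the heart of the correspondence can be previewed with a small computational sketch. For a finite group G with subgroups G_1, ..., G_n, Chan and Yeung associate the vector h(α) = log(|G| / |∩_{i∈α} G_i|), and such vectors are always entropic. The particular group, subgroups, and variable names below are illustrative choices of ours, not taken from [1]:

```python
# Sketch (with assumed, illustrative notation) of a group-characterizable
# entropy vector: h(alpha) = log(|G| / |G_alpha|), where G_alpha is the
# intersection of the subgroups G_i with i in alpha.
from itertools import combinations
from math import log2

# Toy example: G = Z2 x Z2 under coordinatewise XOR, with three
# subgroups of order 2 (this choice of G and G_1, G_2, G_3 is ours).
G = {(0, 0), (0, 1), (1, 0), (1, 1)}
subgroups = {
    1: {(0, 0), (0, 1)},
    2: {(0, 0), (1, 0)},
    3: {(0, 0), (1, 1)},
}

def h(alpha):
    """log2(|G| / |intersection of G_i for i in alpha|), in bits."""
    g_alpha = set(G)
    for i in alpha:
        g_alpha &= subgroups[i]
    return log2(len(G) / len(g_alpha))

# Collect all 2^3 - 1 joint entropies of the induced random variables.
vector = {alpha: h(alpha)
          for r in range(1, 4)
          for alpha in combinations((1, 2, 3), r)}
```

Here each singleton entropy is log2(4/2) = 1 bit, while any pair of subgroups already intersects trivially, so every joint entropy of two or more variables is log2(4) = 2 bits.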