We prove the following generalization of the Entropy Power Inequality: h(Ax) ≥ h(Ax̃), where h(·) denotes (joint) differential entropy, x = (x_1, ..., x_n) is a random vector with independent components, x̃ = (x̃_1, ..., x̃_n) is a Gaussian vector with independent components such that h(x̃_i) = h(x_i), i = 1, ..., n, and A is any matrix. This generalization of the entropy power inequality is applied to show that a non-Gaussian vector with independent components becomes "closer" to Gaussianity after a linear transformation, where the distance to Gaussianity is measured by the information divergence. Another application is a lower bound, greater than zero, for the mutual information between non-overlapping spectral components ...
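As a quick illustrative check of the stated inequality (a minimal sketch, not taken from the paper), consider the special case A = [1 1], i.e. the sum of two independent components. With x_1, x_2 assumed Uniform[0, 1] (so h(x_i) = 0 nats) and x̃_1, x̃_2 the entropy-matched Gaussians, the Python sketch below compares h(x_1 + x_2) with h(x̃_1 + x̃_2) and confirms h(Ax) ≥ h(Ax̃) in this instance.

# Numerical check of the special case A = [1 1] of h(Ax) >= h(Ax~).
# x_1, x_2 ~ Uniform[0, 1], independent, so h(x_i) = 0 nats; the entropy-matched
# Gaussians x~_i then have variance 1/(2*pi*e). All quantities are in nats.
# Illustrative sketch only; the distributions are assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Left-hand side: h(x_1 + x_2), estimated as -E[log f(Z)], where f is the
# triangular density on [0, 2] of the sum of two independent Uniform[0, 1] draws.
z = rng.uniform(0, 1, 1_000_000) + rng.uniform(0, 1, 1_000_000)
f = np.where(z <= 1.0, z, 2.0 - z)      # triangular density of the sum
h_sum_uniform = -np.mean(np.log(f))     # ~0.5 nats (exact value is 1/2)

# Right-hand side: h(x~_1 + x~_2) in closed form. The sum is Gaussian with
# variance 2/(2*pi*e), so its differential entropy is 0.5 * log(2) nats.
var_gauss = 1.0 / (2.0 * np.pi * np.e)
h_sum_gauss = 0.5 * np.log(2.0 * np.pi * np.e * 2.0 * var_gauss)

print(f"h(x1 + x2)   ~ {h_sum_uniform:.4f} nats")
print(f"h(x~1 + x~2) = {h_sum_gauss:.4f} nats")
assert h_sum_uniform >= h_sum_gauss     # the claimed inequality holds here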
Entropy, conditional entropy and mutual information for discrete-valued random variables play impor...
A framework for deriving Rényi entropy-power inequalities (EPIs) is presented ...
It is shown that if X is a random variable whose density satisfies a Poincare ...
Within the framework of linear vector Gaussian channels with arbitrary signaling, the Jacobian of th...
A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is sh...
This paper focuses on developing an alternative proof for an extremal entropy inequality, o...
Yet another simple proof of the entropy power inequality is given, which av...
A nontrivial linear mixture of independent random variables of fixed entrop...
We show that an information-theoretic property of Shannon's entropy power, known as concavity of ent...
We show that a proper expression of the uncertainty relation for a pair of canonically-conjugate con...
The conditional entropy power inequality is a fundamental inequality in information theory, stating ...
Shannon’s Entropy Power Inequality (EPI) can be viewed as characterizing the minimum differ...
In this communication, we describe some interrelations between generalized q-e...