In this paper we derive a generalization of the vector entropy power inequality (EPI) recently put forth in [1], which was valid only for diagonal matrices, to the full matrix case. Next, we study the problem of computing the linear precoder that maximizes the mutual information in linear vector Gaussian channels with arbitrary inputs. In particular, we transform the precoder optimization problem into a new form and, capitalizing on the newly unveiled matrix EPI, we show that some particular instances of the optimization problem can be cast in convex form, i.e., they admit an optimality certificate, which, to the best of our knowledge, had not been obtained previously.
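For reference, the classical entropy power inequality that these vector and matrix versions generalize can be stated as follows (standard background, not part of the new result): for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with densities,
\[
N(X) \triangleq \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}, \qquad N(X+Y) \;\geq\; N(X) + N(Y),
\]
with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.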
The matrix version of the entropy-power inequality for real or complex coeffic...
The entropy power inequality (EPI) has a fundamental role in Information Theory, and has deep connec...
A distance measure that quantifies the difference between two probability distributions plays a fundamental ...
This paper focuses on developing an alternative proof for an extremal entropy inequality, o...
Ph.D. thesis, University of Illinois at Urbana-Champaign, 2006. For the optimality of the sep...
Within the framework of linear vector Gaussian channels with arbitrary signaling, the Jacobian of th...
The design of the precoder that maximizes the mutual information in linear vector Gaussian channels w...
We prove the following generalization of the Entropy Power Inequality: $h(A\mathbf{x}) \geq h(A\tilde{\mathbf{x}})$, where $h(\cdot)$ ...
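As a sanity check (standard, and only implicit in the quoted abstract), the classical EPI is recovered from this statement by taking $A = (1\;\,1)$ and $\mathbf{x} = (x_1, x_2)$ with independent components: each $\tilde{x}_i$ is Gaussian with $h(\tilde{x}_i) = h(x_i)$, hence has variance $\tilde{\sigma}_i^2 = e^{2h(x_i)}/(2\pi e) = N(x_i)$, so
\[
h(x_1 + x_2) \;\geq\; h(\tilde{x}_1 + \tilde{x}_2) \;=\; \tfrac{1}{2}\log\!\big(2\pi e\,(N(x_1) + N(x_2))\big),
\]
which is equivalent to $N(x_1 + x_2) \geq N(x_1) + N(x_2)$.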
We study the design optimization of linear precoders for maximizing the mutual information between f...
Exploiting channel state information at the transmitter and receiver to design an optimal linear pre...
In this paper, derivatives of mutual information for a general linear Gaussian vector channel are co...
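These derivative results are what make numerical precoder optimization practical. The sketch below is a minimal illustration, not the authors' algorithm: it assumes a toy 2x2 channel, per-antenna BPSK inputs, unit-variance white Gaussian noise and a total-power constraint (all hypothetical choices), estimates the MMSE matrix E by Monte Carlo, and uses the known relation that, for y = HPx + n with white Gaussian noise, the gradient of the mutual information (in nats) with respect to the precoder P is H^T H P E, to take projected gradient-ascent steps.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not taken from the abstracts above): a fixed 2x2 channel,
# per-antenna BPSK signaling, unit-variance Gaussian noise, and a total power budget.
H = np.array([[1.0, 0.4],
              [0.2, 0.8]])
POWER = 4.0
constellation = np.array([[a, b] for a in (-1.0, 1.0) for b in (-1.0, 1.0)])

def mmse_matrix(P, n_samples=20000):
    # Monte Carlo estimate of E = E[(x - E[x|y])(x - E[x|y])^T] for y = H P x + n.
    G = H @ P
    x = constellation[rng.integers(len(constellation), size=n_samples)]
    y = x @ G.T + rng.standard_normal((n_samples, 2))
    # Posterior over constellation points: p(s | y) proportional to exp(-||y - G s||^2 / 2).
    diff = y[:, None, :] - constellation @ G.T          # shape (n_samples, |S|, 2)
    logw = -0.5 * np.sum(diff ** 2, axis=2)
    w = np.exp(logw - logw.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    xhat = w @ constellation                            # conditional mean E[x | y]
    err = x - xhat
    return err.T @ err / n_samples

def project(P):
    # Enforce the transmit power constraint tr(P P^T) <= POWER by scaling down.
    tr = np.trace(P @ P.T)
    return P * np.sqrt(POWER / tr) if tr > POWER else P

P = project(np.eye(2))
for _ in range(50):
    E = mmse_matrix(P)
    grad = H.T @ H @ P @ E      # gradient of I(x; HPx + n) w.r.t. P, in nats
    P = project(P + 0.5 * grad)

print(np.round(P, 3))

The ascent step relies only on the gradient identity quoted above; the convex reformulations discussed in the first abstract aim instead at certifying global optimality rather than relying on local ascent.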
This article surveys the application of convex optimization theory to topics in Information Th...
A framework for deriving Rényi entropy-power inequalities (EPIs) is presented ...
We survey the state of the art for the proof of the quantum Gaussian optimizer conjectures of quantu...
We derive a tight lower bound on equivocation (conditional entropy), or equivalently a tight upper b...