A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of the mutual information to the minimum mean-square error (MMSE). This paper generalizes that link to arbitrary channels, giving representations of the derivative of the mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of these representations in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language.
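For the Gaussian-channel special case referenced in the first sentence, the known identity is the I-MMSE relation of Guo, Shamai, and Verdu: dI(snr)/dsnr = (1/2) mmse(snr), so the mutual information can be computed numerically by integrating the MMSE over SNR. The following Python sketch illustrates this for a scalar Gaussian channel with equiprobable BPSK input; the channel model and the helper names mmse_bpsk and mutual_info_bpsk are illustrative assumptions for this sketch, not constructs from the paper.

import numpy as np
from scipy.integrate import quad

def mmse_bpsk(snr):
    # MMSE of X in Y = sqrt(snr) * X + N, with equiprobable X in {-1, +1}
    # and N ~ N(0, 1). The conditional mean is E[X | Y = y] = tanh(sqrt(snr) * y),
    # so MMSE = 1 - E[tanh^2(sqrt(snr) * Y)]; by symmetry we may condition on X = +1.
    if snr == 0.0:
        return 1.0  # no observation: MMSE equals the prior variance of X
    gauss = lambda n: np.exp(-0.5 * n ** 2) / np.sqrt(2.0 * np.pi)
    integrand = lambda n: gauss(n) * np.tanh(snr + np.sqrt(snr) * n) ** 2
    e_tanh_sq, _ = quad(integrand, -np.inf, np.inf)
    return 1.0 - e_tanh_sq

def mutual_info_bpsk(snr):
    # I-MMSE identity: dI/dsnr = 0.5 * mmse(snr), hence
    # I(snr) = 0.5 * integral_0^snr mmse(g) dg, in nats.
    val, _ = quad(lambda g: 0.5 * mmse_bpsk(g), 0.0, snr)
    return val

print(mutual_info_bpsk(4.0))  # tends to log(2) ~ 0.693 nats (1 bit) as snr -> infinity

Nested quadrature keeps the sketch short; for a sweep over many SNR values, tabulating mmse_bpsk on a grid and accumulating a trapezoidal sum would be faster.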
Mutual information is calculated for processes described by stochastic differential equations. The e...
Within the framework of linear vector Gaussian channels with arbitrary signaling, the Jacobian of th...
Abstract—Identities yielding optimal estimation interpretations for mutual information and relative ...
In this paper, derivatives of mutual information for a general linear Gaussian vector channel are co...
This paper considers a general linear vector Gaussian channel with arbitrary signaling and pursues t...
Relations between estimation and information measures have received considerable attention from the ...
Abstract — Consider arbitrarily distributed input signals observed in additive Gaussian noise. A new...
We derive a tight lower bound on equivocation (conditional entropy), or equivalently a tight upper b...
Abstract—Many of the classical and recent relations between information and estimation in the presen...
Abstract—In addition to exploring its various regularity properties, we show that the minimum mean-...
Abstract — In the early years of information theory, mutual information was defined as a random vari...
Abstract—We discuss some of the recent literature on relations between information and estimation th...
Abstract—In the Gaussian channel Y(t) = Φ(t) + X(t) = message + noise, where Φ(t) and X(t) are mutual...