We study the problem of estimating the overall mutual information in M independent parallel discrete-time memoryless Gaussian channels from N independent sample pairs (inputs and outputs) per channel. We focus on the sparse regime, in which the number of active channels L is small compared with the total number of channels (L ≪ M); there, direct application of the maximum-likelihood principle is prone to overfitting, especially for moderate to small N. For this regime, we show that the bias of the mutual information estimate is reduced by resorting to the minimum description length (MDL) principle. As a result, only simple pre-processing based on a per-channel threshold on the empirical squared correlation coefficient is required...
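To make the described pipeline concrete, here is a minimal sketch in Python of a per-channel plug-in estimator with an MDL-style threshold. The function and variable names are illustrative, and the generic penalty of (1/2) log N per extra active channel is an assumption for the sketch, not necessarily the paper's exact criterion.

```python
import numpy as np

def estimate_total_mi(x, y, penalty=0.5):
    """Plug-in estimate of the total mutual information over M parallel
    Gaussian channels from N sample pairs per channel, with an MDL-style
    per-channel threshold on the empirical squared correlation.

    x, y: arrays of shape (M, N). The penalty of 0.5 * log(N) per extra
    active channel is a generic MDL choice (assumption); the paper's
    constant may differ.
    """
    M, N = x.shape
    xc = x - x.mean(axis=1, keepdims=True)
    yc = y - y.mean(axis=1, keepdims=True)
    # Empirical squared correlation coefficient, per channel.
    rho2 = (xc * yc).sum(axis=1) ** 2 / ((xc**2).sum(axis=1) * (yc**2).sum(axis=1))
    # Per-channel maximum-likelihood (plug-in) MI estimate, in nats.
    mi_hat = -0.5 * np.log1p(-rho2)
    # MDL keeps a channel only when its data-fit gain N * mi_hat exceeds
    # the extra description length of declaring it active.
    active = N * mi_hat > penalty * np.log(N)
    return mi_hat[active].sum()

# Toy usage: only the first L of M channels carry signal.
rng = np.random.default_rng(0)
M, N, L = 200, 40, 5
x = rng.standard_normal((M, N))
y = rng.standard_normal((M, N))
y[:L] += 2.0 * x[:L]
print(estimate_total_mi(x, y))
```

The thresholding step is what distinguishes this from the plain maximum-likelihood sum over all M channels, which would accumulate a positive bias of roughly 1/(2N) nats per inactive channel.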
A relationship between information theory and estimation theory was recently shown for the Gaussian ...
In the early years of information theory, mutual information was defined as a random vari...
Recently, we introduced a simple variational bound on mutual information that resolves some of the ...
We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projec...
The scalar additive Gaussian noise channel has the “single crossing point” property between the min...
The mutual information of a discrete time Rayleigh fading channel is considered, where neither the t...
We consider distributed estimation of a source in additive Gaussian ...
Relations between estimation and information measures have received considerable attention from the ...
Consider arbitrarily distributed input signals observed in additive Gaussian noise. A new...
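The snippet above is truncated, but the relation in question appears to be the I-MMSE identity of Guo, Shamai, and Verdú; stated here for reference (not recovered from the snippet itself), for the scalar channel in nats:

```latex
% I-MMSE identity for Y = \sqrt{\mathrm{snr}}\, X + Z with Z \sim \mathcal{N}(0,1):
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\!\left(X;\, \sqrt{\mathrm{snr}}\, X + Z\right)
  = \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) = \mathbb{E}\!\left[\left(X - \mathbb{E}[X \mid Y]\right)^{2}\right].
```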
Information theoretic criteria (ITC) have been widely adopted in engineering and statistics for sele...
The mutual information of independent parallel Gaussian-noise channels is maximized, under an averag...
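The maximization referenced above, under an average power constraint, is solved by classical water-filling. A minimal sketch follows, with illustrative names (noise_var, total_power are assumptions, not the cited paper's notation):

```python
import numpy as np

def water_filling(noise_var, total_power, tol=1e-12):
    """Water-filling power allocation over independent parallel Gaussian
    channels under a total power constraint: p_m = max(mu - noise_var[m], 0),
    with the water level mu found by bisection so that sum(p_m) == total_power."""
    noise_var = np.asarray(noise_var, dtype=float)
    lo, hi = 0.0, noise_var.max() + total_power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        # Allocated power is nondecreasing in mu, so bisection applies.
        if np.maximum(mu - noise_var, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    p = np.maximum(0.5 * (lo + hi) - noise_var, 0.0)
    # Maximized mutual information (nats) with Gaussian inputs.
    return p, 0.5 * np.log1p(p / noise_var).sum()

# Example: three channels, total power budget 3.
p, mi = water_filling([0.5, 1.0, 2.0], 3.0)
```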
Model selection problems appear frequently in a wide array of application domains such as data compr...
This paper deals with the control of bias estimation when estimating mutual in...