One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is neither obvious nor straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal suff...
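To make the claim concrete: if S_X and S_Y denote the minimal sufficient statistics of X about Y and of Y about X respectively (our notation, not the abstract's), the one-sided replacement property and the simultaneous version asserted above read

\[
  I(X;Y) \;=\; I\bigl(S_X(X);\,Y\bigr) \;=\; I\bigl(X;\,S_Y(Y)\bigr) \;=\; I\bigl(S_X(X);\,S_Y(Y)\bigr).
\]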
Quantifying synergy among stochastic variables is an important open problem in information theory. I...
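The canonical illustration of synergy, against which most proposals in this literature are tested, is the XOR gate: with X_1, X_2 independent uniform bits and Y = X_1 \oplus X_2,

\[
  I(X_1;Y) = I(X_2;Y) = 0, \qquad I(X_1,X_2;Y) = 1 \text{ bit},
\]

so the full bit of information about Y is accessible only from the inputs jointly.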
We present two classes of improved estimators for mutual information M(X,Y), from samples of random ...
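A minimal sketch of the first of these k-nearest-neighbour estimators, assuming the standard form \(\hat I = \psi(k) + \psi(N) - \langle \psi(n_x+1) + \psi(n_y+1)\rangle\) with max-norm distances; the function name and the strict-inequality tolerance are ours:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG estimator (variant 1) of I(X;Y) in nats from paired samples."""
    x = np.asarray(x).reshape(len(x), -1)   # ensure shape (N, dx)
    y = np.asarray(y).reshape(len(y), -1)   # ensure shape (N, dy)
    n = len(x)
    z = np.hstack([x, y])                   # points in the joint space

    # Max-norm distance to the k-th nearest neighbour in the joint space;
    # k + 1 because the query point itself is returned at distance 0.
    eps = cKDTree(z).query(z, k=k + 1, p=np.inf)[0][:, -1]

    # n_x(i), n_y(i): points strictly closer than eps_i in each marginal
    # space (a small tolerance emulates the strict inequality; -1 drops self).
    tx, ty = cKDTree(x), cKDTree(y)
    nx = np.array([len(tx.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(ty.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Sanity check on correlated Gaussians: for Y = X + noise with rho^2 = 1/2,
# the true value is -0.5 * ln(1 - rho^2), roughly 0.347 nats.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
print(ksg_mi(x, x + rng.normal(size=2000)))
```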
MOTIVATION: Mutual information (MI) is a quantity that measures the dependence between two arbitrary...
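For contrast with such estimators, the naive plug-in baseline that binning-based MI methods refine can be written in a few lines (a sketch; the fixed bin count is our assumption, and the bias it induces is exactly what refined estimators target):

```python
import numpy as np

def plugin_mi(x, y, bins=16):
    """Histogram plug-in estimate of I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                          # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)       # marginal over y
    py = pxy.sum(axis=0, keepdims=True)       # marginal over x
    nz = pxy > 0                              # empty cells contribute nothing
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```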
One of the most fundamental questions one can ask about a pair of random variables X and Y is the va...
We derive a tight lower bound on equivocation (conditional entropy), or equivalently a tight upper b...
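For orientation, the classical but generally loose bound that such tight results sharpen is Fano's inequality; with P_e the error probability of the optimal estimator of X from Y over an alphabet of size M, and h(\cdot) the binary entropy,

\[
  H(X \mid Y) \;\le\; h(P_e) + P_e \log(M-1),
  \qquad\text{equivalently}\qquad
  I(X;Y) \;\ge\; H(X) - h(P_e) - P_e \log(M-1).
\]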
Reshef et al. recently proposed a new statistical measure, the “maximal information coefficient” (M...
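For reference, MIC as defined by Reshef et al. maximizes a normalized grid-based mutual information over all n_x-by-n_y grids whose size is bounded by a function B(n) of the sample size (n^{0.6} in the original paper):

\[
  \mathrm{MIC}(x,y) \;=\; \max_{n_x n_y < B(n)} \frac{I^{*}(x,y;\,n_x,n_y)}{\log \min(n_x,\,n_y)},
\]

where I^{*} is the largest mutual information achievable by any such grid.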
The problem of how to properly quantify redundant information is an open question that has been the ...
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper...
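For context, the standard event-level quantity that such formulations generalize is the pointwise mutual information, whose expectation recovers the usual MI:

\[
  i(x;y) \;=\; \log \frac{p(x,y)}{p(x)\,p(y)},
  \qquad
  I(X;Y) \;=\; \mathbb{E}_{p(x,y)}\bigl[\,i(x;y)\,\bigr].
\]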
One of the main notions of information theory is the notion of mutual information in two m...
We consider the problem of defining a measure of redundant information that quantifies how much comm...
Many of the classical and recent relations between information and estimation in the presen...
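A representative identity in this line of work is the I-MMSE relation of Guo, Shamai, and Verdú: for the scalar channel Y_\gamma = \sqrt{\gamma}\,X + N with N \sim \mathcal{N}(0,1) independent of X (stated here in nats),

\[
  \frac{d}{d\gamma}\, I\bigl(X;\,\sqrt{\gamma}\,X + N\bigr)
  \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\gamma),
  \qquad
  \mathrm{mmse}(\gamma) \;=\; \mathbb{E}\Bigl[\bigl(X - \mathbb{E}[X \mid \sqrt{\gamma}\,X + N]\bigr)^{2}\Bigr].
\]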
Identities yielding optimal estimation interpretations for mutual information and relative ...
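One identity of this type (stated here for the scalar Gaussian channel, in nats, as a sketch of the known result rather than this paper's exact statement) expresses relative entropy as an integrated excess mean-square error: with X \sim P but the estimator optimal under the mismatched prior Q,

\[
  D(P \,\|\, Q) \;=\; \frac{1}{2}\int_{0}^{\infty}
  \bigl[\mathrm{mse}_{Q}(\gamma) - \mathrm{mmse}_{P}(\gamma)\bigr]\, d\gamma .
\]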