We consider the problem of defining a measure of redundant information that quantifies how much common information two or more random variables specify about a target random variable. We discuss desired properties of such a measure and propose new measures with some desirable properties.
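The abstract does not fix a specific formula, but one well-known candidate for such a redundancy measure is Williams and Beer's I_min, which averages, over target outcomes, the minimum specific information any single source provides. The sketch below (an illustrative assumption, not the measure proposed in this paper) computes I_min for two binary sources and a binary target:

```python
import numpy as np

# Hedged sketch: Williams & Beer's I_min redundancy measure, one candidate
# quantification of the "common information two sources specify about a target".
# p[x1, x2, y] is a joint distribution over three discrete variables.

def specific_info(p, y, axis):
    # Specific information I(Y=y; X_i) for the source on `axis` (0 -> X1, 1 -> X2):
    # sum over x_i of p(x_i | y) * log2( p(x_i, y) / (p(x_i) p(y)) )
    p_y = p.sum(axis=(0, 1))[y]          # p(y)
    p_xi = p.sum(axis=1 - axis)          # joint p(x_i, y), shape (|X_i|, |Y|)
    total = 0.0
    for xi in range(p_xi.shape[0]):
        p_xi_y = p_xi[xi, y]
        if p_xi_y == 0:
            continue
        p_xi_marg = p_xi[xi].sum()       # p(x_i)
        total += (p_xi_y / p_y) * np.log2(p_xi_y / (p_xi_marg * p_y))
    return total

def i_min(p):
    # I_min(Y; {X1}, {X2}) = sum_y p(y) * min_i I(Y=y; X_i)
    p_y = p.sum(axis=(0, 1))
    return sum(p_y[y] * min(specific_info(p, y, 0), specific_info(p, y, 1))
               for y in range(p.shape[2]) if p_y[y] > 0)

# Example: Y = X1 AND X2 with uniform independent inputs.
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1 & x2] = 0.25

redundancy = i_min(p)   # approx. 0.311 bits for the AND gate
```

I_min is known to satisfy some of the desired properties discussed here (e.g. nonnegativity) while violating others, which is part of what motivates proposing new measures.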
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper...
Using Shannon information theory to analyse the contributions from two source variables to a target,...
Reshef et al. recently proposed a new statistical measure, the “maximal information coefficient ” (M...
The problem of how to properly quantify redundant information is an open question that has been the ...
One of the most basic characterizations of the relationship between two random variables, X and Y, i...
We propose new measures of shared information, unique information and synergistic information that c...
We introduce an information theoretic measure of statistical structure, called 'binding information'...
One of the most fundamental questions one can ask about a pair of random variables X and Y is the va...
Quantifying synergy among stochastic variables is an important open problem in information theory. I...
The introduction of the partial information decomposition generated a flurry of proposals for defini...
What are the distinct ways in which a set of predictor variables can provide information about a tar...