We propose new measures of shared information, unique information and synergistic information that can be used to decompose the mutual information of a pair of random variables (Y, Z) with a third random variable X. Our measures are motivated by an operational idea of unique information, which suggests that shared information and unique information should depend only on the marginal distributions of the pairs (X, Y) and (X, Z). Although this invariance property has not been studied before, it is satisfied by other proposed measures of shared information. The invariance property does not uniquely determine our new measures, but it implies that the functions that we define are bounds on any other measures satisfying the same invariance property. We ...
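For orientation, the decomposition referred to above is commonly written as follows; this is a sketch in standard partial information decomposition notation, and the symbols SI, UI and CI (shared, unique and synergistic/complementary information) are labels assumed here rather than taken from the text above:

\[
I(X; (Y, Z)) \;=\; SI(X; Y, Z) \;+\; UI(X; Y \setminus Z) \;+\; UI(X; Z \setminus Y) \;+\; CI(X; Y, Z).
\]

A common way to realize the invariance property (again an assumption about the construction, not a quotation from the abstract) is to optimize over the set \(\Delta_P\) of joint distributions \(Q\) that agree with \(P\) on the marginals of (X, Y) and of (X, Z); for instance, taking \(UI(X; Y \setminus Z) = \min_{Q \in \Delta_P} I_Q(X; Y \mid Z)\) yields a unique-information term that depends only on those pair marginals, which is how such definitions come to bound any other measure with the same invariance.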