We study the problem of identifying the causal relationship between two discrete random variables from observational data. We recently proposed a novel framework called entropic causality that works in a very general functional model but assumes that the unobserved exogenous variable has small entropy in the true causal direction. This framework requires solving a minimum entropy coupling problem: given marginal distributions of m discrete random variables, each on n states, find the joint distribution with minimum entropy that respects the given marginals. This corresponds to minimizing a concave function of n^m variables over a convex polytope defined by nm linear constraints, called a transportation polytope. Unfort...
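The minimum entropy coupling problem described in this abstract is commonly attacked with a greedy heuristic: repeatedly match the largest remaining probability masses of the marginals. The sketch below illustrates that idea for m = 2 marginals; the function names are illustrative and not taken from the paper, and the greedy result is an approximation, not the exact minimum.

```python
import math

def greedy_coupling(p, q, eps=1e-12):
    """Greedy heuristic for a low-entropy coupling of two marginals.

    Repeatedly assigns min(p[i], q[j]) of joint mass to the cell (i, j)
    where i, j index the largest remaining marginal masses, then
    subtracts that mass from both marginals. The result respects the
    given marginals but is only an approximation of the true minimum
    entropy coupling.
    """
    p, q = list(p), list(q)          # work on copies of the marginals
    joint = {}
    while max(p) > eps and max(q) > eps:
        i = max(range(len(p)), key=lambda k: p[k])
        j = max(range(len(q)), key=lambda k: q[k])
        m = min(p[i], q[j])
        joint[(i, j)] = joint.get((i, j), 0.0) + m
        p[i] -= m
        q[j] -= m
    return joint

def entropy_bits(joint):
    """Shannon entropy (in bits) of a joint distribution given as a dict."""
    return -sum(v * math.log2(v) for v in joint.values() if v > 0)
```

For example, coupling p = (0.5, 0.25, 0.25) with q = (0.5, 0.5) greedily yields three nonzero cells with entropy 1.5 bits, whereas the independent (product) coupling has 2.5 bits.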
We propose a new inference rule for estimating causal structure that underlies the observed statist...
We develop a novel approach to approximate a specified collection of marginal distributions on subse...
A central question for causal inference is to decide whether a set of correlations fits a given caus...
We study the problem of identifying the causal relationship between two discrete random variables fr...
We consider the problem of identifying the causal direction between two discrete random variables us...
Given two discrete random variables X and Y, with probability distributions p = (p(1), . . . , p(n))...
Given two probability distributions p = (p_1 ,p_2 ,...,p_n ) and q = (q_1 ,q_2 ,...,q_m ) of two dis...
It is a task of widespread interest to learn the underlying causal structure for systems of random v...
Given two discrete random variables X and Y, with probability distributions p = (p1,..., pn) and q =...
Abstract. The broad abundance of time series data, which is in sharp contrast to limited knowledge o...
Entropy and conditional mutual information are the key quantities information theory provides to mea...
We prove several fundamental statistical bounds for entropic OT with the squared Euclidean cost betw...
This paper provides a new approach to recover relative entropy measures of contemporaneous dependenc...
Transfer Entropy has been applied to experimental datasets to unveil causality between variables. In...