We consider the problem of identifying the causal direction between two discrete random variables using observational data. Unlike previous work, we keep the most general functional model but make an assumption on the unobserved exogenous variable: inspired by Occam's razor, we assume that the exogenous variable is simple in the true causal direction. We quantify simplicity using Rényi entropy. Our main result is that, under natural assumptions, if the exogenous variable has low H0 entropy (small cardinality) in the true direction, it must have high H0 entropy in the wrong direction. We establish several algorithmic hardness results for estimating the minimum-entropy exogenous variable. We show that the problem of finding the exogenous variable...
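As a concrete illustration of the entropy measure invoked above (a sketch for the reader, not code from the paper), the order-α Rényi entropy of a discrete distribution can be computed as follows; the order-0 case H0 reduces to the log-cardinality of the support, and the α → 1 limit recovers Shannon entropy:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits) of a probability vector p.

    alpha = 0: H0, the log of the support size (cardinality).
    alpha = 1: the Shannon-entropy limit.
    otherwise: (1 / (1 - alpha)) * log2(sum of p_i ** alpha).
    """
    support = [q for q in p if q > 0]
    if alpha == 0:
        # H0 depends only on how many outcomes have nonzero probability
        return math.log2(len(support))
    if alpha == 1:
        # Shannon entropy as the limiting case
        return -sum(q * math.log2(q) for q in support)
    return math.log2(sum(q ** alpha for q in support)) / (1 - alpha)
```

For example, `renyi_entropy([0.5, 0.25, 0.25], 0)` is log2(3) regardless of how the mass is spread, which is why low H0 of the exogenous variable corresponds to small cardinality.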
This paper provides a new approach to recover relative entropy measures of contemporaneous dependenc...
We propose a new approach to infer the causal structure that has generated the observed statistical ...
Given two discrete random variables X and Y, with probability distributions p = (p(1), . . . , p(n))...
We study the problem of identifying the causal relationship between two discrete random variables fr...
It is a task of widespread interest to learn the underlying causal structure for systems of random v...
The broad abundance of time series data, which is in sharp contrast to limited knowledge of the unde...
We propose a new inference rule for estimating causal structure that underlies the observed statisti...
A central question for causal inference is to decide whether a set of correlations fits a given caus...
Transfer Entropy has been applied to experimental datasets to unveil causality between variables. In...
The demands for machine learning and knowledge extraction methods have been booming due to the unpre...
Inferring the causal structure of a set of random variables from a finite sample of the joint distri...