Conditional mutual information quantifies the conditional dependence of two random variables: it extends mutual information to condition on the information encoded in a third variable. Although it has numerous applications, it requires large amounts of data to estimate accurately and suffers from the curse of dimensionality, limiting its use in the analysis of electrophysiological data, as well as in data science and machine learning. Here, a Kozachenko-Leonenko approximation is used to derive a nearest-neighbours estimator. This estimator is derived using only the metric structure of the data and depends only on the distances between data points, not on the dimension of the data; it extends previous work on mutual information to the conditional case. Control over the d...
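The abstract is cut off before the details, so the following is only a minimal sketch of a nearest-neighbours conditional-MI estimator in this family (a Frenzel-Pompe-style variant, not necessarily this paper's derivation); the function name cmi_knn and the default k=4 are illustrative assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def cmi_knn(x, y, z, k=4):
    """k-nearest-neighbour estimate of I(X; Y | Z).

    Frenzel-Pompe-style sketch: uses only inter-point distances,
    no explicit density model. x, y, z are (n,) or (n, d) arrays.
    """
    as2d = lambda a: np.asarray(a, float).reshape(len(a), -1)
    x, y, z = map(as2d, (x, y, z))
    xyz = np.hstack([x, y, z])

    # eps[i]: Chebyshev distance from point i to its k-th neighbour in the
    # joint (x, y, z) space (k+1 because the query returns the point itself)
    eps = cKDTree(xyz).query(xyz, k=k + 1, p=np.inf)[0][:, -1]

    def n_within(pts):
        # neighbours of each point within its own eps ball, self excluded
        tree = cKDTree(pts)
        return tree.query_ball_point(pts, r=eps, p=np.inf,
                                     return_length=True) - 1

    n_xz = n_within(np.hstack([x, z]))
    n_yz = n_within(np.hstack([y, z]))
    n_z = n_within(z)

    # Frenzel & Pompe: psi(k) - <psi(n_xz+1) + psi(n_yz+1) - psi(n_z+1)>
    return digamma(k) - np.mean(digamma(n_xz + 1) + digamma(n_yz + 1)
                                - digamma(n_z + 1))
```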
This paper deals with the control of bias estimation when estimating mutual in...
Mutual information is used in a procedure to estimate time-delays between recordings of electroencep...
We assume that even the simplest model of the brain is nonlinear and ‘causal’. Proceeding with the ...
We propose a new estimator to measure directed dependencies in time series. The dimensionality of da...
Slides for a short talk given at CNS*2021 - Workshop for Methods of Information Theory in Computatio...
Mutual information is a measurable quantity of particular interest for several applications that int...
Kernel density estimation is a technique for approximating probability distributions. Here, it is ap...
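This abstract is also truncated; assuming the application is plug-in estimation of information-theoretic quantities (consistent with the surrounding entries), a minimal resubstitution MI estimate via Gaussian KDE could look like the sketch below. The function name mi_kde and the 1-D inputs are assumptions, not the paper's method:

```python
import numpy as np
from scipy.stats import gaussian_kde

def mi_kde(x, y):
    """Plug-in (resubstitution) MI estimate for 1-D samples x, y:
    fit Gaussian KDEs to p(x, y), p(x), p(y) and average
    log p(x, y) / (p(x) p(y)) over the observed samples."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xy = np.vstack([x, y])        # gaussian_kde expects shape (dim, n)
    p_xy = gaussian_kde(xy)(xy)   # joint density at each sample
    p_x = gaussian_kde(x)(x)      # marginal densities
    p_y = gaussian_kde(y)(y)
    return float(np.mean(np.log(p_xy / (p_x * p_y))))  # in nats
```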
We present a framework for quantifying the dynamics of information in coupled physiological systems ...
Each parameter θ in an abstract parameter space Θ is associated with a different probability di...
Mutual information quantifies the determinism that exists in a relationship between random variables...
We present two classes of improved estimators for mutual information M(X,Y), from samples of random ...
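This appears to be the abstract of the Kraskov-Stögbauer-Grassberger (KSG) k-nearest-neighbour estimators; since the text is cut off, here is only a minimal sketch of what is usually called KSG algorithm 1 (names and the default k are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def mi_ksg(x, y, k=4):
    """KSG (algorithm 1) estimate of I(X; Y) from paired samples:
    I_hat = psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>,
    with n_x, n_y the marginal neighbour counts within the k-th
    joint-space distance (Chebyshev metric)."""
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    xy = np.hstack([x, y])
    n = len(xy)

    # distance to the k-th neighbour in the joint space
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]

    # marginal counts; <= eps vs. < eps only differs on measure-zero ties
    n_x = cKDTree(x).query_ball_point(x, r=eps, p=np.inf,
                                      return_length=True) - 1
    n_y = cKDTree(y).query_ball_point(y, r=eps, p=np.inf,
                                      return_length=True) - 1

    return digamma(k) + digamma(n) - np.mean(digamma(n_x + 1)
                                             + digamma(n_y + 1))
```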
Estimating conditional dependence between two random variables given the knowledge of a third random...
Mutual information is fundamentally important for measuring statistical dependence between variables...
Mutual Information (MI) and Conditional Mutual Information (CMI) are multi-purpose tools from inform...