We discuss reproducing kernel Hilbert space (RKHS)-based measures of statistical dependence, with emphasis on constrained covariance (COCO), a novel criterion for testing the dependence of random variables. We show that COCO is a test for independence if and only if the associated RKHSs are universal. That said, no independence test exists that can distinguish dependent and independent random variables in all circumstances. Dependent random variables can result in a COCO that is arbitrarily close to zero when the source densities are highly non-smooth, which can make dependence hard to detect empirically; all current kernel-based independence tests share this behaviour. Finally, we demonstrate exponential convergence between the population and empirical COCO.
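For orientation, the following is a minimal NumPy sketch of an empirical COCO-style statistic, assuming the commonly used form COCO_emp = (1/n) * sqrt(lambda_max(Kc Lc)), where Kc and Lc are the centred Gram matrices of the two samples; the Gaussian kernel, the bandwidth sigma, and the helper names (rbf_gram, empirical_coco) are illustrative assumptions rather than anything fixed by the abstract above.

# Empirical COCO sketch: largest kernel covariance between unit-norm RKHS
# functions of x and of y, estimated from paired samples.
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Gaussian (RBF) Gram matrix for an (n,) or (n, d) sample array.
    x = np.atleast_2d(x.T).T
    sq = np.sum(x ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * x @ x.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def empirical_coco(x, y, sigma=1.0):
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    Kc = H @ rbf_gram(x, sigma) @ H
    Lc = H @ rbf_gram(y, sigma) @ H
    # Kc @ Lc is similar to a PSD matrix, so its spectrum is real and >= 0;
    # tiny imaginary or negative parts below are numerical noise.
    lam = np.max(np.real(np.linalg.eigvals(Kc @ Lc)))
    return np.sqrt(max(lam, 0.0)) / n

rng = np.random.default_rng(0)
x = rng.normal(size=500)
print(empirical_coco(x, x ** 2))                  # dependent pair: clearly > 0
print(empirical_coco(x, rng.normal(size=500)))    # independent pair: near 0

On smooth densities such as these the statistic separates the two cases easily; the point made in the abstract is that for highly non-smooth source densities the dependent case can also be driven arbitrarily close to zero.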
This thesis contributes to the field of nonparametric hypothesis testing (i.e. two-sample and indepe...
Dependence measures based on reproducing kernel Hilbert spaces, also known as Hilbert-Schmidt Indepe...
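Several of the entries here refer to the Hilbert-Schmidt Independence Criterion (HSIC). For orientation, a widely used biased empirical estimator is HSIC_b = (1/n^2) tr(K H L H) (some papers normalise by (n-1)^2 instead); the sketch below is an illustrative NumPy version with an assumed Gaussian kernel and bandwidth, not the specific estimator of any one paper listed here.

# Biased empirical HSIC sketch: (1/n^2) * trace(K H L H).
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Gaussian (RBF) Gram matrix for an (n,) or (n, d) sample array.
    x = np.atleast_2d(x.T).T
    sq = np.sum(x ** 2, axis=1)
    return np.exp(-(sq[:, None] + sq[None, :] - 2.0 * x @ x.T) / (2.0 * sigma ** 2))

def hsic_biased(x, y, sigma=1.0):
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    return np.trace(rbf_gram(x, sigma) @ H @ rbf_gram(y, sigma) @ H) / n ** 2

rng = np.random.default_rng(1)
x = rng.normal(size=500)
print(hsic_biased(x, np.sin(x) + 0.1 * rng.normal(size=500)))  # dependent: > 0
print(hsic_biased(x, rng.normal(size=500)))                    # independent: near 0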
We describe a novel non-parametric statistical hypothesis test of relative dep...
We propose an independence criterion based on the eigenspectrum of covariance operators in reproduci...
We introduce two new functionals, the constrained covariance and the kernel mutual information, to m...
We propose a new measure of conditional dependence of random variables, based on normalized cross-co...
A fundamental problem in neuroscience is determining whether or not particular neural signals are de...
Although kernel measures of independence have been widely applied in machine learning (notably in ke...
Tests of dependence are an important tool in statistical analysis, and are wid...
Measures of statistical dependence between random variabl...