We propose an independence criterion based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm of the cross-covariance operator (we term this a Hilbert-Schmidt Independence Criterion, or HSIC). This approach has several advantages, compared with previous kernel-based independence criteria. First, the empirical estimate is simpler than any other kernel dependence test, and requires no user-defined regularisation. Second, there is a clearly defined population quantity which the empirical estimate approaches in the large sample limit, with exponential convergence guaranteed between the two: this ensures that independence tests based on ...
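The empirical estimate described above is commonly written as tr(KHLH)/(n-1)^2, where K and L are kernel matrices on the two samples and H is the centering matrix. A minimal sketch in Python/NumPy, assuming Gaussian (RBF) kernels with an illustrative bandwidth parameter `sigma` (the function name and defaults are ours, not the paper's):

```python
import numpy as np

def hsic_biased(X, Y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n-1)^2 with RBF kernels.

    X, Y: (n, d) arrays of paired samples. Returns a nonnegative scalar
    that is near zero for independent samples (a sketch, not the authors'
    reference implementation).
    """
    def rbf(A):
        # Squared Euclidean distances, then Gaussian kernel matrix.
        sq = np.sum(A ** 2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * A @ A.T, 0.0)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    n = X.shape[0]
    K, L = rbf(X), rbf(Y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

Note there is no regularisation parameter: the only user choice is the kernel itself, which matches the "no user-defined regularisation" advantage claimed above.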
We discuss reproducing kernel Hilbert space (RKHS)-based measures of statistical dependence, with em...
We introduce two new functionals, the constrained covariance and the kernel mutual information, to ...
Many machine learning algorithms can be formulated in the framework of statistical independence such...
Although kernel measures of independence have been widely applied in machine learning (notably in ke...
The Hilbert-Schmidt Independence Criterion (HSIC) is a dependence measure based on reproducing kerne...
Kernel techniques are among the most popular and powerful approaches of data science. Among the key ...
Measures of statistical dependence between random variabl...
A statistical test of independence may be constructed using the Hilbert-Schmidt Independence Criteri...
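One standard way to turn the HSIC statistic into a test, as the abstract above indicates, is to approximate the null distribution by permuting one sample. A self-contained sketch, assuming RBF kernels and our own illustrative function names (`hsic_biased`, `hsic_permutation_test`); the permutation scheme shown is a generic Monte Carlo test, not necessarily the paper's preferred null approximation:

```python
import numpy as np

def hsic_biased(X, Y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n-1)^2 with RBF kernels."""
    def rbf(A):
        sq = np.sum(A ** 2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * A @ A.T, 0.0)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    n = X.shape[0]
    K, L = rbf(X), rbf(Y)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def hsic_permutation_test(X, Y, num_perm=200, seed=0):
    """Permutation p-value for H0: X and Y independent.

    Shuffling the rows of Y breaks any dependence while preserving the
    marginals, so the shuffled statistics approximate the null.
    """
    rng = np.random.default_rng(seed)
    stat = hsic_biased(X, Y)
    null = np.array([hsic_biased(X, Y[rng.permutation(len(Y))])
                     for _ in range(num_perm)])
    # +1 correction keeps the p-value strictly positive and valid.
    p_value = (1 + np.sum(null >= stat)) / (1 + num_perm)
    return stat, p_value
```

For strongly dependent data the observed statistic typically exceeds every permuted statistic, giving a p-value near 1/(num_perm + 1).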
Dependence measures based on reproducing kernel Hilbert spaces, also known as Hilbert-Schmidt Indepe...
This paper introduces a nonlinear measure of dependence between random variables in the context of r...