Kernel density estimation is a technique for approximating probability distributions. Here, it is applied to the estimation of mutual information on a metric space. This is motivated by the problem in neuroscience of calculating the mutual information between stimuli and spiking responses, since the space of spiking responses is a metric space. It is shown that kernel density estimation on a metric space resembles the k-nearest-neighbor approach. The method is applied to a toy dataset designed to mimic electrophysiological data.
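As a rough illustration of the resemblance noted above (a minimal sketch, not the paper's estimator), a density estimate on a metric space can be formed with a uniform "ball" kernel: the fraction of samples within distance h of a query point. Replacing the fixed radius h with the distance to the k-th nearest sample makes the bandwidth data-adaptive, which is exactly the k-nearest-neighbor idea. The function names below (ball_kde, knn_radius) are illustrative, and volume normalization is omitted since ball volumes are generally unknown on an abstract metric space.

```python
import numpy as np

def ball_kde(query, samples, metric, h):
    """Uniform-kernel density estimate on a metric space:
    the fraction of samples within distance h of the query.
    (Unnormalized: ball volume may be unknown in general.)"""
    d = np.array([metric(query, s) for s in samples])
    return np.mean(d <= h)

def knn_radius(query, samples, metric, k):
    """k-nearest-neighbor variant: the bandwidth adapts to the
    data as the distance to the k-th nearest sample."""
    d = sorted(metric(query, s) for s in samples)
    return d[k - 1]

# Toy check on the real line with the absolute-value metric.
rng = np.random.default_rng(0)
xs = rng.normal(size=500)
metric = lambda a, b: abs(a - b)
print(ball_kde(0.0, xs, metric, h=0.5))   # mass within a fixed ball
print(knn_radius(0.0, xs, metric, k=50))  # adaptive radius at the query
```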