Approximating a divergence between two probability distributions from their samples is a fundamental challenge in the statistics, information theory, and machine learning communities, because a divergence estimator can be used for various purposes such as two-sample homogeneity testing, change-point detection, and class-balance estimation. Furthermore, an approximator of a divergence between the joint distribution and the product of marginals can be used for independence testing, which has a wide range of applications including feature selection and extraction, clustering, object matching, independent component analysis, and causality learning. In this article, we review recent advances in direct divergence approximation that follow th...
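As a minimal, hypothetical illustration of the sample-based setting described above (not one of the direct, density-ratio based approximators the article reviews), the following Python sketch computes a naive histogram plug-in estimate of the KL divergence between two one-dimensional sample sets; the function name, bin count, and smoothing constant eps are arbitrary choices made only for this example.

    import numpy as np

    def kl_plugin_estimate(x, y, bins=30, eps=1e-12):
        """Naive plug-in estimate of KL(P||Q) from samples x ~ P and y ~ Q."""
        lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
        p, edges = np.histogram(x, bins=bins, range=(lo, hi), density=True)
        q, _ = np.histogram(y, bins=bins, range=(lo, hi), density=True)
        width = edges[1] - edges[0]
        p = p * width + eps   # convert bin densities to (smoothed) bin probabilities
        q = q * width + eps
        p /= p.sum()
        q /= q.sum()
        return float(np.sum(p * np.log(p / q)))

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 5000)   # samples from P
    y = rng.normal(0.5, 1.0, 5000)   # samples from Q
    print(kl_plugin_estimate(x, y))  # roughly 0.125, the true KL between these Gaussians

Plug-in estimators of this kind first estimate the two densities and then evaluate the divergence, which is exactly the two-step detour that direct approximation methods try to avoid.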
When it is acknowledged that all candidate parameterised statistical models are misspecified relativ...
There are many applications that benefit from computing the exact divergence between 2 discrete prob...
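To make the discrete setting mentioned above concrete, here is a short Python sketch (a generic illustration, not the algorithm of that paper) computing two exact divergences between discrete distributions given as probability vectors; the helper names are hypothetical.

    import numpy as np

    def exact_kl(p, q):
        """Exact KL(P||Q) for discrete pmfs p, q on the same support (q > 0 wherever p > 0)."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def total_variation(p, q):
        """Exact total variation distance between discrete pmfs p and q."""
        return 0.5 * float(np.abs(np.asarray(p, float) - np.asarray(q, float)).sum())

    p = [0.2, 0.5, 0.3]
    q = [0.3, 0.4, 0.3]
    print(exact_kl(p, q), total_variation(p, q))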
This thesis documents three different contributions in statistical learning theory. They were develo...
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies...
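For reference, the standard definition of the quantity discussed above, for distributions P and Q with densities (or probability mass functions) p and q:

    D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx
    \qquad\text{(discrete case: } D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x p(x) \log \frac{p(x)}{q(x)} \text{)}.

The divergence is non-negative, equals zero only when P = Q, and is not symmetric in its two arguments.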
The idea of using functionals of Information Theory, such as entropies or divergences, in statistica...
Patrice Bertail (referee), Denis Bosq (jury chair), Michel Delecroix, Dominique Picard, Ya'acov Rit...
Kullback-Leibler divergence is a leading measure of similarity or dissimilarity of probability distr...
This paper analyses the Contrastive Divergence algorithm for learning statistical parameters. We rel...
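As background for the abstract above, here is a minimal sketch of one Contrastive Divergence (CD-1) update for a Bernoulli restricted Boltzmann machine, under standard assumptions (sigmoid units, stochastic binary sampling); the function and variable names are illustrative and this is not the specific analysis carried out in the paper.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cd1_update(W, b, c, v0, lr=0.01, rng=None):
        """One CD-1 step for a Bernoulli RBM: weights W, visible bias b, hidden bias c."""
        rng = np.random.default_rng() if rng is None else rng
        # Positive phase: hidden probabilities given the data batch v0
        ph0 = sigmoid(v0 @ W + c)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # One Gibbs step: reconstruct visibles, then recompute hidden probabilities
        pv1 = sigmoid(h0 @ W.T + b)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + c)
        # Gradient approximation: data correlations minus reconstruction correlations
        n = v0.shape[0]
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * (ph0 - ph1).mean(axis=0)
        return W, b, c

    rng = np.random.default_rng(0)
    W = 0.01 * rng.normal(size=(6, 4))               # 6 visible units, 4 hidden units
    b, c = np.zeros(6), np.zeros(4)
    v0 = (rng.random((32, 6)) < 0.5).astype(float)   # a batch of binary "data"
    W, b, c = cd1_update(W, b, c, v0, rng=rng)

CD-k replaces the intractable model expectation in the maximum-likelihood gradient with a k-step Gibbs reconstruction, which is why the resulting update is biased relative to the exact gradient.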
Divergence is a discrepancy measure between two objects, such as functions, vectors, matrices, and s...
Data science, information theory, probability theory, statistical learning, statistical signal proce...
Generalisation error estimation is an important issue in machine learning. Cross-validation traditio...
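As a minimal illustration of the cross-validation procedure mentioned above (a generic sketch, not the estimator studied in that work), the following Python code computes a K-fold estimate of generalisation error for an arbitrary fit/predict pair; the helper names and the toy mean predictor are hypothetical.

    import numpy as np

    def k_fold_error(X, y, fit, predict, loss, k=5, seed=0):
        """K-fold cross-validation estimate of the expected prediction loss."""
        idx = np.random.default_rng(seed).permutation(len(y))
        folds = np.array_split(idx, k)
        errors = []
        for i in range(k):
            test = folds[i]
            train = np.concatenate([folds[j] for j in range(k) if j != i])
            model = fit(X[train], y[train])
            errors.append(loss(y[test], predict(model, X[test])))
        return float(np.mean(errors))

    # Example with a trivial mean predictor and squared-error loss
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
    fit = lambda Xtr, ytr: ytr.mean()
    predict = lambda model, Xte: np.full(len(Xte), model)
    loss = lambda yt, yp: float(np.mean((yt - yp) ** 2))
    print(k_fold_error(X, y, fit, predict, loss))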
The Kullback-Leibler (KL) divergence is one of the most fundamental metrics in information theory an...
Inferring and comparing complex, multivariable probability density functions is fundamental to probl...
Csiszár's f-divergence is a way to measure the similarity of two probability distributions. We study...
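For context, the standard definition of Csiszár's f-divergence, for a convex generator f with f(1) = 0 and distributions P, Q with densities p, q:

    D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx .

Familiar special cases correspond to particular generators: f(t) = t log t gives the KL divergence, f(t) = (\sqrt{t} - 1)^2 the squared Hellinger distance, and f(t) = |t - 1|/2 the total variation distance.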