The Kullback-Leibler divergence is a widespread dissimilarity measure between probability density functions, based on the Shannon entropy. Unfortunately, there is no analytic formula available to compute this divergence between mixture models, imposing the use of costly approximation algorithms. In order to reduce the computational burden when many divergence evaluations are needed, we introduce a sub-class of the mixture models in which the component parameters are shared between a set of mixtures and the only degree of freedom is the vector of weights of each mixture. This sharing makes it possible to design extremely fast versions of existing dissimilarity measures between mixtures. We demonstrate the effectiveness of our approach by evaluating th...
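To illustrate why shared components help, here is a hypothetical Python sketch (the variable names, the Gaussian components, and the Monte Carlo setup are our own, not taken from the abstract): the shared component densities are evaluated once on a fixed sample, after which the divergence for any pair of weight vectors costs only two matrix-vector products.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
means = np.array([-2.0, 0.0, 3.0])    # shared component means (assumed)
sigmas = np.array([1.0, 0.5, 1.5])    # shared component scales (assumed)
w_p = np.array([0.2, 0.5, 0.3])       # weight vector of mixture p
w_q = np.array([0.6, 0.1, 0.3])       # weight vector of mixture q

# Draw a Monte Carlo sample from p: pick a component, then sample from it.
comp = rng.choice(len(w_p), size=5000, p=w_p)
x = rng.normal(means[comp], sigmas[comp])

# Evaluate every shared component on the sample ONCE: an (n, K) matrix.
dens = norm.pdf(x[:, None], loc=means[None, :], scale=sigmas[None, :])

# Each further divergence evaluation is just two matrix-vector products.
kl_pq = np.mean(np.log(dens @ w_p) - np.log(dens @ w_q))
print(f"Monte Carlo estimate of KL(p || q): {kl_pq:.4f}")
```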
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies...
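For reference, the oriented KL divergence between densities $p$ and $q$ is conventionally defined as

```latex
D_{\mathrm{KL}}(p \,\|\, q) \;=\; \int p(x) \log \frac{p(x)}{q(x)} \,\mathrm{d}x \;\ge\; 0,
```

with equality if and only if $p = q$ almost everywhere. It is asymmetric in its two arguments, which motivates the symmetrized variants discussed below.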
Inferring and comparing complex, multivariable probability density functions is fundamental to probl...
The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback–Leibler diver...
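"Arithmetic symmetrization" here means summing (or, in some conventions, averaging; the two differ by a factor of 2) the two oriented divergences:

```latex
J(p, q) \;=\; D_{\mathrm{KL}}(p \,\|\, q) \;+\; D_{\mathrm{KL}}(q \,\|\, p).
```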
As a probabilistic distance between two probability density functions, Kullback-Leibler divergence i...
In many applications, such as image retrieval and change detection, we need to assess the similarity...
Mixture distributions arise in many parametric and non-parametric settings—for example, in Gaussian ...
A mixture model for ordinal data modelling (denoted CUB) has been recently proposed in the literature. S...
In this work we present two new methods for approximating the Kullback-Leibler (KL) divergence betwe...
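The excerpt does not identify the two methods; for orientation, the standard baseline such approximations are measured against is the Monte Carlo estimator

```latex
\widehat{D}_{\mathrm{MC}}(f \,\|\, g) \;=\; \frac{1}{n} \sum_{i=1}^{n} \log \frac{f(x_i)}{g(x_i)},
\qquad x_i \overset{\text{i.i.d.}}{\sim} f,
```

which is consistent as $n \to \infty$ but requires many mixture-density evaluations per pair of mixtures.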
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler dive...
Kullback-Leibler divergence is a leading measure of similarity or dissimilarity of probability distr...
Estimators derived from the expectation-maximization (EM) algorithm are not ro...
Approximating a divergence between two probability distributions from their samples is a fundamenta...
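One classical instance of such sample-based estimation is the k-nearest-neighbour KL estimator of Wang, Kulkarni, and Verdú; the sketch below is illustrative only (the function name is ours, and this is not necessarily the method proposed in the paper above).

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D(p||q) from x ~ p and y ~ q, shapes (n, d) and (m, d)."""
    n, d = x.shape
    m = y.shape[0]
    # k-th NN distance within x, skipping each point's zero distance to itself.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # k-th NN distance from each x_i to the sample y.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

# Toy check: two unit-variance Gaussians in 1-D; the true KL here is 0.5.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(2000, 1))
y = rng.normal(1.0, 1.0, size=(2000, 1))
print(knn_kl_divergence(x, y))  # roughly 0.5
```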
We investigate the problem of estimating the proportion vector which maximizes the likelih...
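With the component densities held fixed, the proportion vector admits a simple EM-style update; as a worked equation (this is the standard EM iteration for mixture weights, not necessarily the estimator studied in this abstract):

```latex
\pi_k^{(t+1)} \;=\; \frac{1}{n} \sum_{i=1}^{n}
\frac{\pi_k^{(t)} f_k(x_i)}{\sum_{j} \pi_j^{(t)} f_j(x_i)},
\qquad k = 1, \dots, K.
```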