The Kullback-Leibler divergence is a widespread dissimilarity measure between probability density functions, based on the Shannon entropy. Unfortunately, no analytic formula is available to compute this divergence between mixture models, forcing the use of costly approximation algorithms. To reduce the computational burden when many divergence evaluations are needed, we introduce a sub-class of mixture models in which the component parameters are shared across a set of mixtures and the only degree of freedom is each mixture's weight vector. This sharing allows us to design extremely fast versions of existing dissimilarity measures between mixtures. We demonstrate the effectiveness of our ...
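A minimal sketch of the shared-component idea (illustrative only, not the paper's actual API; the Gaussian component set, sample size, and function names here are assumptions): evaluate every shared component once on a fixed Monte Carlo sample, after which any pairwise KL estimate costs only two matrix-vector products over the weight vectors.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical shared component set: K univariate Gaussians (illustrative).
rng = np.random.default_rng(0)
K, N = 8, 10_000
means = np.linspace(-4.0, 4.0, K)
scales = np.full(K, 1.0)

# Reference mixture with uniform weights over the shared components; it is
# used both to draw the Monte Carlo sample and as the importance proposal.
w_ref = np.full(K, 1.0 / K)
comp = rng.choice(K, size=N, p=w_ref)
x = rng.normal(means[comp], scales[comp])

# One-time cost: Phi[i, j] = f_j(x_i). Afterwards, any mixture density over
# these components is just Phi @ w.
Phi = norm.pdf(x[:, None], loc=means[None, :], scale=scales[None, :])
ref = Phi @ w_ref

def kl_shared(w_p, w_q):
    """Importance-sampled estimate of KL(p || q) for two mixtures that share
    the component set and differ only in their weight vectors."""
    p, q = Phi @ w_p, Phi @ w_q
    # E_ref[(p/ref) * log(p/q)] = integral of p * log(p/q) = KL(p || q).
    return np.mean((p / ref) * np.log(p / q))

# Example: two random weight vectors over the same shared components.
w_p = rng.dirichlet(np.ones(K))
w_q = rng.dirichlet(np.ones(K))
print(kl_shared(w_p, w_q))
```

Because `Phi` is computed once, the expensive density evaluations are amortized over arbitrarily many weight pairs, which is the source of the speed-up the abstract alludes to.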
A mixture model for ordinal data modelling (denoted CUB) has recently been proposed in the literature. S...
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler d...
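For context, the standard textbook form of this symmetrization (stated here for reference; the cited work may use different notation or skew weights):

```latex
\mathrm{JS}(p \,\|\, q) = \tfrac{1}{2}\,\mathrm{KL}(p \,\|\, m) + \tfrac{1}{2}\,\mathrm{KL}(q \,\|\, m),
\qquad m = \tfrac{1}{2}(p + q),
\qquad 0 \le \mathrm{JS}(p \,\|\, q) \le \log 2 .
```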
In many applications, such as image retrieval and change detection, we need to assess the similarity...
As a probabilistic distance between two probability density functions, Kullback-Leibler divergence i...
Entropy-type measures for the heterogeneity of data have been used for a long time. In a mixture mo...
In this work we present two new methods for approximating the Kullback-Leibler (KL) divergence betwe...
Approximating a divergence between two probability distributions from their samples is a fundamenta...
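One classical sample-based estimator is the k-nearest-neighbor construction of Wang, Kulkarni and Verdú; the sketch below (plain NumPy/SciPy, offered as an illustration rather than the specific method of the cited work) shows the idea.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl(x, y, k=1):
    """k-nearest-neighbor estimate of KL(P || Q) from samples.

    x : (n, d) array of samples from P; y : (m, d) array of samples from Q.
    """
    n, d = x.shape
    m = y.shape[0]
    # rho: distance from each x_i to its k-th nearest neighbor *within* x
    # (query k+1 neighbors because x_i is its own neighbor at distance zero).
    rho = cKDTree(x).query(x, k=k + 1)[0][..., -1]
    # nu: distance from each x_i to its k-th nearest neighbor in y.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[..., -1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

# Quick check on two shifted Gaussians: KL(N(0,1) || N(1,1)) = 0.5 exactly.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))
y = rng.normal(1.0, 1.0, size=(5000, 1))
print(knn_kl(x, y))  # should be close to 0.5
```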
In many applications in biology, engineering and economics, identifying similarities and differences...
Kullback-Leibler divergence is a leading measure of similarity or dissimilarity of probability distr...
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler dive...
In this work, we deliver a novel measure of similarity between Gaussian mixture models (GMMs) by nei...
Mixture distributions arise in many parametric and non-parametric settings—for example, in Gaussian ...