Mixture distributions arise in many parametric and non-parametric settings—for example, in Gaussian mixture models and in non-parametric estimation. It is often necessary to compute the entropy of a mixture, but, in most cases, this quantity has no closed-form expression, making some form of approximation necessary. We propose a family of estimators based on a pairwise distance function between mixture components, and show that this estimator class has many attractive properties. For many distributions of interest, the proposed estimators are efficient to compute, differentiable in the mixture parameters, and become exact when the mixture components are grouped into well-separated clusters. We prove this family includes lower and upper bounds on the mixture entropy. ...
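To make the idea concrete, below is a minimal sketch of one plausible instance of a pairwise-distance estimator for the entropy of a one-dimensional Gaussian mixture. The abstract above only states that the estimators are built from a pairwise distance between components; the specific functional form used here, H_hat = sum_i w_i H(p_i) - sum_i w_i log sum_j w_j exp(-D(p_i || p_j)) with D taken to be the KL divergence between Gaussian components, is an assumption for illustration, not necessarily the exact estimator proposed in the paper.

```python
import numpy as np

def gaussian_kl(mu0, var0, mu1, var1):
    """KL divergence KL(N(mu0, var0) || N(mu1, var1)) for 1-D Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def pairwise_mixture_entropy_estimate(weights, mus, variances):
    """Pairwise-distance estimate of the entropy of a 1-D Gaussian mixture.

    Assumed (illustrative) form:
        H_hat = sum_i w_i H(p_i) - sum_i w_i log sum_j w_j exp(-D(p_i || p_j)),
    instantiated here with D = KL divergence between Gaussian components.
    """
    weights = np.asarray(weights, dtype=float)
    mus = np.asarray(mus, dtype=float)
    variances = np.asarray(variances, dtype=float)

    # Differential entropy of each Gaussian component.
    comp_entropies = 0.5 * np.log(2.0 * np.pi * np.e * variances)

    # Pairwise divergences D[i, j] = KL(p_i || p_j).
    n = len(weights)
    D = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = gaussian_kl(mus[i], variances[i], mus[j], variances[j])

    # log sum_j w_j exp(-D[i, j]) for each component i.
    log_inner = np.log(np.sum(weights[None, :] * np.exp(-D), axis=1))

    return float(np.sum(weights * comp_entropies) - np.sum(weights * log_inner))

# Example: two well-separated unit-variance components. The estimate
# approaches the per-component entropy plus the entropy of the mixing
# weights (ln 2), consistent with the clustering property stated above.
print(pairwise_mixture_entropy_estimate([0.5, 0.5], [0.0, 10.0], [1.0, 1.0]))
```

This sketch illustrates the properties claimed in the abstract: it only needs pairwise divergences between components (efficient whenever those have closed forms), it is differentiable in the mixture parameters, and it becomes exact as the components separate into distinct clusters.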