We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by the constant $1-\gamma$. For the models that we consider, this bound is approached as the cardinality of the sample space tends to infinity, provided the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters, the expected divergence behaves in a similar way. These results serve as a reference for ranking the approximation capabilities of other statistical models.
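As a rough illustration of the simplest special case, where the model consists of the uniform distribution alone, the Monte Carlo sketch below (not taken from the paper; the helper name `expected_kl_to_uniform` and the sample sizes are chosen for illustration) estimates $E[D(p\,\|\,u)]$ for $p$ drawn from the uniform Dirichlet prior, with divergences in nats and $\gamma$ read as the Euler-Mascheroni constant:

```python
import numpy as np

# Illustrative helper: Monte Carlo estimate of E[D(p || u)] for p ~ Dirichlet(1, ..., 1)
# on n outcomes, where u is the uniform distribution over the same n outcomes.
def expected_kl_to_uniform(n, num_samples=20_000, seed=0):
    rng = np.random.default_rng(seed)
    p = rng.dirichlet(np.ones(n), size=num_samples)   # draws from the uniform prior on the simplex
    logp = np.log(np.clip(p, 1e-300, None))           # guard against log(0); such terms contribute ~0
    kl = np.sum(p * (logp + np.log(n)), axis=1)       # D(p || u) = sum_i p_i log(n p_i), in nats
    return float(kl.mean())

for n in (2, 10, 100, 1000):
    print(n, round(expected_kl_to_uniform(n), 4))
# The estimates increase with n toward 1 - gamma (about 0.4228),
# consistent with the bound stated in the abstract for the uniform prior.
```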
We consider fitting uncategorical data to a parametric family of distributions...
Kullback–Leibler divergence is minimized among finite distributions with finit...
In this paper we establish an upper and a lower bound for the $f$-divergence of two discrete...
Recently, a new entropy-based divergence measure has been introduced which is much like Kullback-Lei...
We focus on an important property upon generalization of the Kullback-Leibler divergence used in non...
This paper studies the complexity of estimating Rényi divergences of discrete distributions: p obser...
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies...
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is rel...
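To make the relation concrete, here is a small sketch (illustrative only, not from the cited work) of the standard Rényi divergence of order $\alpha$ for discrete distributions, $D_\alpha(p\,\|\,q)=\frac{1}{\alpha-1}\log\sum_x p(x)^\alpha q(x)^{1-\alpha}$, whose $\alpha\to 1$ limit recovers the Kullback-Leibler divergence; the helper name is hypothetical:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(p || q) = log(sum_x p(x)^alpha * q(x)^(1-alpha)) / (alpha - 1), alpha > 0, alpha != 1."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    support = p > 0                                   # terms with p(x) = 0 contribute nothing
    total = np.sum(p[support] ** alpha * q[support] ** (1.0 - alpha))
    return float(np.log(total) / (alpha - 1.0))

p, q = [0.5, 0.5], [0.9, 0.1]
kl = sum(pi * np.log(pi / qi) for pi, qi in zip(p, q))   # Kullback-Leibler divergence for comparison
for alpha in (0.5, 0.9, 0.999, 2.0):
    print(alpha, round(renyi_divergence(p, q, alpha), 4))
print("KL:", round(kl, 4))
# As alpha approaches 1, the Rényi divergence approaches the Kullback-Leibler divergence.
```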
Given two probability mass functions p(x) and q(x), $D(p \,\|\, q)$, the Kullback-Leibler divergence (or r...
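For reference, a minimal sketch of this definition for discrete distributions, $D(p\,\|\,q)=\sum_x p(x)\log\frac{p(x)}{q(x)}$ in nats; the helper name is illustrative, not from the cited text:

```python
import numpy as np

def kl_divergence(p, q):
    """D(p || q) = sum_x p(x) * log(p(x) / q(x)), defined when q(x) > 0 wherever p(x) > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    support = p > 0                                   # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

print(kl_divergence([0.5, 0.5], [0.9, 0.1]))   # about 0.5108 nats; asymmetric and nonnegative
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))   # 0.0, attained exactly when p == q
```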
The estimated Kullback-Leibler divergence between the eight species and the three random graph mo...
Based on rescaling by some suitable sequence instead of the number of time uni...
Comparing processes or models is of interest in various applications. Among the existing approaches,...