In this paper, we generalize the results obtained with the Kullback distance (corresponding to empirical likelihood) and Cressie-Read metrics (generalized empirical likelihood) to general discrepancies, for some convex functions satisfying a few regularity properties. In particular, we introduce a new Bartlett-correctable family of empirical discrepancies, the Quasi-Kullback, outside the Cressie-Read family, which possesses interesting finite-sample properties. We conclude this work with some simulations in the multidimensional case for different discrepancies.
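For reference, a standard parametrization of the Cressie-Read power-divergence family referred to above (the index $\lambda$ and the weight vectors $P=(p_i)$, $Q=(q_i)$ are generic notation, not taken from the paper) is
$$ I_\lambda(P,Q) \;=\; \frac{1}{\lambda(\lambda+1)} \sum_i p_i\left[\Big(\frac{p_i}{q_i}\Big)^{\lambda} - 1\right], \qquad \lambda \in \mathbb{R}\setminus\{0,-1\}, $$
with the limits $\lambda \to 0$ and $\lambda \to -1$ recovering the two orientations of the Kullback-Leibler divergence, one of which corresponds to empirical likelihood.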
We focus on an important property upon generalization of the Kullback-Leibler divergence used in non...
We consider a very general class of empirical discrepancy statistics that includes the Cressie-Read ...
Standard properties of $\phi$-divergences of probability measures are widely applied in vari...
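As a reminder of the standard definition (stated here for completeness, not quoted from the abstract above): for a convex function $\phi$ with $\phi(1)=0$ and discrete distributions $P=(p_i)$, $Q=(q_i)$,
$$ D_\phi(P,Q) \;=\; \sum_i q_i\,\phi\!\left(\frac{p_i}{q_i}\right). $$
The choice $\phi(t)=t\log t$ gives the Kullback-Leibler divergence, while $\phi_\lambda(t)=\frac{t^{\lambda+1}-t}{\lambda(\lambda+1)}$ gives the Cressie-Read family.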
We study some extensions of the empirical likelihood method, when the Kullback distance is replaced b...
We review some recent extensions of the so-called generalized empirical likelihood method, when the ...
This paper investigates the family of empirical Cressie-Read discrepancy statistics with m...
This paper considers a Kullback-Leibler distance (KLD) which is asymptotically equivalent to the KLD...
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies...
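For completeness, the defining formula for discrete distributions $P=(p_i)$ and $Q=(q_i)$ is
$$ D_{\mathrm{KL}}(P\,\|\,Q) \;=\; \sum_i p_i \log\frac{p_i}{q_i}, $$
which is nonnegative and vanishes if and only if $P=Q$.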
Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics....
In this paper we establish an upper and a lower bound for the $f$-divergence of two discrete...
Kullback-Leibler divergence is a leading measure of similarity or dissimilarity of probability distr...
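As an illustration of how such a dissimilarity measure is computed in practice, here is a minimal Python sketch (the helper kl_divergence is hypothetical, not code from any of the cited papers):

import numpy as np

def kl_divergence(p, q):
    # D(P || Q) for discrete distributions given as probability vectors.
    # Convention: terms with p_i = 0 contribute 0; q_i = 0 with p_i > 0 yields +inf.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: compare a skewed three-point distribution with the uniform one.
print(kl_divergence([0.5, 0.25, 0.25], [1/3, 1/3, 1/3]))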
Divergence measures are widely used in various applications of pattern recognition, signal processin...