We describe a framework to build distances by measuring the tightness of inequalities and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy–Schwarz divergence. We report closed-form formulas for those statistical dissimilarities when considering distributions belonging to the same exponential family provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling and thus do not require dist...
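The Cauchy–Schwarz divergence mentioned above, which both Hölder classes encapsulate as a special case, can be illustrated numerically. The sketch below (an illustrative assumption, not code from the paper) evaluates D_CS(p, q) = -log(∫pq / √(∫p² ∫q²)) on a discretized grid for two univariate Gaussians; the function names and grid setup are hypothetical:

```python
import numpy as np

def cauchy_schwarz_divergence(p, q, dx):
    """Numerical Cauchy-Schwarz divergence between two densities sampled
    on a common grid with spacing dx:
        D_CS(p, q) = -log( integral(p*q) / sqrt(integral(p^2) * integral(q^2)) ).
    By the Cauchy-Schwarz inequality, the log argument lies in (0, 1],
    so the divergence is nonnegative; it is symmetric and zero for p = q."""
    pq = np.sum(p * q) * dx  # Riemann-sum approximation of the integral
    pp = np.sum(p * p) * dx
    qq = np.sum(q * q) * dx
    return -np.log(pq / np.sqrt(pp * qq))

# Two Gaussian densities on a grid (illustrative choice of parameters).
x = np.linspace(-10.0, 10.0, 10001)
dx = x[1] - x[0]
gauss = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
p, q = gauss(0.0, 1.0), gauss(1.0, 2.0)

d_pq = cauchy_schwarz_divergence(p, q, dx)
d_qp = cauchy_schwarz_divergence(q, p, dx)
```

Because the divergence measures the tightness of the Cauchy–Schwarz inequality, `d_pq` is strictly positive for these two distinct Gaussians and equals `d_qp` exactly (the formula is symmetric in p and q); for multivariate Gaussians the integrals admit the closed forms the paper reports, so no discretization would be needed.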
Jensen–Shannon divergence is a well known multi-purpose measure of dissimilarity between probability...
This paper deals with four types of point estimators based on minimization of information-th...
The concept of $f$-divergences was introduced by Csiszár in 1963 as measures of the ‘hardnes...
We propose a simple method of construction of new families of $\phi$-divergences. This meth...
Standard properties of $\phi$-divergences of probability measures are widely applied in vari...
There are many applications that benefit from computing the exact divergence between two discrete prob...
Divergence measures are widely used in various applications of pattern recognition, signal processin...
Multivariate Gaussian densities are pervasive in pattern recognition and machine learning. A central...
Divergence is a discrepancy measure between two objects, such as functions, vectors, matrices, and s...
In this paper, we propose a generalization of Rényi divergence, and then we investigate its induced ...
In this paper, we generalize the results obtained with the Kullback distance (...
Motivated by the method of interpolating inequalities that makes use of the improved Jensen-type ine...
This thesis documents three different contributions in statistical learning theory. They were develo...