We consider the problem of parameter estimation in a Bayesian setting and propose a general lower bound that covers part of the family of $f$-divergences. The results are then applied to specific settings of interest and compared to other notable results in the literature. In particular, we show that the known bounds using Mutual Information can be improved by using, for example, Maximal Leakage, the Hellinger divergence, or generalizations of the Hockey-Stick divergence.

Comment: Submitted to ISIT 202
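The abstract names several specific information measures (the Hellinger divergence of a given order, the hockey-stick divergence, and maximal leakage). The short Python sketch below is purely illustrative and is not the paper's construction: it shows how these standard quantities can be evaluated for discrete distributions; the example distributions and channel matrix are arbitrary assumptions used only for demonstration.

```python
# Illustrative sketch (not from the paper): standard divergences between two
# discrete distributions p and q on the same finite alphabet, natural logs.
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q), the f-divergence with f(t) = t*log(t)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                       # assumes p is absolutely continuous w.r.t. q
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def hellinger_divergence(p, q, alpha):
    """Hellinger divergence of order alpha != 1 (assumes q has full support):
    H_alpha(p||q) = (sum_x p(x)^alpha q(x)^(1-alpha) - 1) / (alpha - 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float((np.sum(p**alpha * q**(1 - alpha)) - 1) / (alpha - 1))

def hockey_stick_divergence(p, q, gamma):
    """Hockey-stick divergence E_gamma(p||q) = sum_x max(p(x) - gamma*q(x), 0), gamma >= 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(np.maximum(p - gamma * q, 0.0)))

def maximal_leakage(channel):
    """Maximal leakage log(sum_y max_x P(y|x)) for a row-stochastic channel matrix
    with rows indexed by x and columns by y."""
    channel = np.asarray(channel, float)
    return float(np.log(np.sum(np.max(channel, axis=0))))

if __name__ == "__main__":
    # Arbitrary example inputs (assumptions for demonstration only).
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.2, 0.5, 0.3])
    print("KL(p||q)              =", kl_divergence(p, q))
    print("Hellinger, alpha=2    =", hellinger_divergence(p, q, alpha=2.0))    # equals chi^2(p||q)
    print("Hockey-stick, gamma=1 =", hockey_stick_divergence(p, q, gamma=1.0)) # equals total variation
    print("Maximal leakage       =", maximal_leakage([[0.9, 0.1], [0.2, 0.8]]))
```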
In this work, we introduce new information inequalities on a new generalized f-divergence in terms of ...
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled ...
We propose a modified χβ-divergence, give some of its properties, and show tha...
In this paper we establish an upper and a lower bound for the $f$-divergence of two discrete...
Given two probability measures P and Q and an event E, we provide bounds on P(E) in terms of Q(E) an...
For two arbitrary probability measures on real d-space with given means and variances (covariance ma...
Lower bounds involving $f$-divergences between the underlying probability measures are proved ...
This thesis documents three different contributions in statistical learning theory. They were develo...
The von Mises-Fisher family is a parametric family of distributions on the surface of the unit ball,...
In this work, the probability of an event under some joint distribution is bounded by measuring it w...