The goal of this short note is to discuss the relation between the Kullback--Leibler divergence and the total variation distance, starting with the celebrated Pinsker's inequality relating the two, before turning to a simple yet (arguably) more useful inequality, apparently not as well known, due to Bretagnolle and Huber. We also discuss applications of this bound to minimax testing lower bounds.
Comment: Update: positive answer to Question 8 ("from the TFL to the BH bound"), communicated to me by Hao-Chung Chen.
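For reference, here is a brief statement of the two inequalities the note discusses, in their standard forms; the notation TV for total variation distance and KL for Kullback--Leibler divergence is introduced here for convenience and may differ from the conventions used in the note itself.
\[
  \mathrm{TV}(P,Q) \;\le\; \sqrt{\tfrac{1}{2}\,\mathrm{KL}(P\,\|\,Q)} \qquad \text{(Pinsker)},
\]
\[
  \mathrm{TV}(P,Q) \;\le\; \sqrt{1 - e^{-\mathrm{KL}(P\,\|\,Q)}},
  \quad\text{equivalently}\quad
  1 - \mathrm{TV}(P,Q) \;\ge\; \tfrac{1}{2}\, e^{-\mathrm{KL}(P\,\|\,Q)} \qquad \text{(Bretagnolle--Huber)}.
\]
Unlike Pinsker's inequality, the Bretagnolle--Huber bound remains informative when the KL divergence is large, which is what makes it useful for testing lower bounds.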