We show that the variational representations for f-divergences currently used in the literature can be tightened. This has implications for a number of methods recently proposed based on this representation. As an example application, we use our tighter representation to derive a general f-divergence estimator based on two i.i.d. samples, and we derive the dual program for this estimator, which performs well empirically. We also point out a connection between our estimator and MMD.
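For context, the variational representation referred to above is, in its standard (untightened) form widely used in the literature, the following lower-bound characterization; it is stated here as background rather than as the tightened representation derived in the paper. For a convex function $f$ with convex conjugate $f^{*}$,
$$D_f(P \,\|\, Q) \;=\; \sup_{T:\mathcal{X}\to\mathbb{R}} \Big( \mathbb{E}_{P}[T(X)] \;-\; \mathbb{E}_{Q}\big[f^{*}(T(X))\big] \Big),$$
where restricting $T$ to a tractable function class (e.g. an RKHS or a neural network) yields a lower bound on $D_f(P \,\|\, Q)$ that can be estimated from two i.i.d. samples; such restrictions are one standard source of looseness that a tighter representation can reduce.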
In this work, the probability of an event under some joint distribution is bounded by measuring it w...
Csiszár's f-divergence is a way to measure the similarity of two probability distributions. We study...
We consider extensions of the Shannon relative entropy, referred to as f-divergences. Three classica...
This paper focuses on f-divergences and consists of three main contributions. The first one introd...
This paper introduces the $\textit{variational Rényi bound}$ (VR) that extends traditional variation...
f-divergences are a general class of divergences between probability measures which include as speci...
We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this general...
We propose an approach for estimating f-divergences that exploits a new representation of an f-dive...
This paper introduces the $f$-EI$(\phi)$ algorithm, a novel iterative algorithm which operates on me...
Given two probability measures P and Q and an event E, we provide bounds on P(E) in terms of Q(E) an...
We unify f-divergences, Bregman divergences, surrogate regret bounds, proper scoring rules, cost cur...
We propose new change of measure inequalities based on $f$-divergences (of which the Kullbac...
Jensen's inequality plays a crucial role in obtaining inequalities for divergences between probabil...