The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$ subject to $Q \in \mathcal{E}$. All directional derivatives of the divergence from $\mathcal{E}$ are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for $P$ to be a maximizer of the divergence from $\mathcal{E}$ are presented, including new ones for the case that $P$ is not projectable to $\mathcal{E}$.
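To make the defining infimum concrete, here is a minimal numerical sketch (not the paper's construction; the four-point sample space, the sufficient statistic f, and the one-parameter family Q_theta are illustrative assumptions): it approximates $D(P\|\mathcal{E}) = \inf_{Q \in \mathcal{E}} D(P\|Q)$ by minimizing the Kullback-Leibler divergence over the natural parameter.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative setup (assumed, not from the paper): a four-point
    # sample space with a one-dimensional sufficient statistic f.
    f = np.array([0.0, 1.0, 2.0, 3.0])   # sufficient statistic on {0, 1, 2, 3}
    P = np.array([0.1, 0.2, 0.3, 0.4])   # measure whose divergence from E we want

    def Q(theta):
        """Member of the family E: Q_theta(x) proportional to exp(theta * f(x))."""
        w = np.exp(theta * f)
        return w / w.sum()

    def kl(p, q):
        """Kullback-Leibler divergence D(p || q) on a finite set."""
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    # D(P || E) as an unconstrained minimum over the natural parameter theta;
    # the minimizer Q* is the (reverse) information projection of P onto E.
    res = minimize(lambda t: kl(P, Q(t[0])), x0=[0.0])
    print("D(P || E) =", res.fun)
    print("projection Q* =", Q(res.x[0]))

For this one-parameter family the optimizer converges to the moment-matching member: the minimizing $Q_\theta$ satisfies $\mathbb{E}_{Q_\theta}[f] = \mathbb{E}_P[f]$ whenever $\mathbb{E}_P[f]$ lies in the interior of the range of $f$.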
In this work, we propose novel results for the optimization of divergences wit...
This paper introduces the $f$-EI$(\phi)$ algorithm, a novel iterative algorithm which operates on me...
The directed divergence of type β, which generalizes Kullback's directed divergence or Information me...
The subject of this thesis is the maximization of the information divergence from an exponential fam...
The information divergence of a probability measure P from an exponential family E over a finite set...
This article studies exponential families $\mathcal{E}$ on finite sets such that the informa...
The problem of maximizing the information divergence from an exponential family is generalized...
An explicit solution of the problem of maximizing the information divergence from the family o...
We study the problem of maximizing information divergence from a new perspective using logarithmic V...
We propose a simple method for constructing new families of $\phi$-divergences. This meth...
We study optimal solutions to an abstract optimization problem for measures, which is a generalizati...
We consider the problem of parameter estimation in a Bayesian setting and propose a general lower-bo...
This work studies the standard exponential families of probability measures on Euclidean spa...
Stochastic interdependence of a probability distribution on a product space is measured by i...