SUMMARY An explicit solution of the problem of maximizing the information divergence from the family of multinomial distributions is presented. The general problem of maximizing the information divergence from an exponential family has emerged in probabilistic models of evolution and of learning in neural networks based on infomax principles. The maximizers admit an interpretation as stochastic systems of high complexity with respect to the exponential family.
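The following is a minimal numerical sketch, not the explicit solution of the paper, meant only to illustrate the quantity being maximized in the simplest multinomial setting: the divergence D(P || B_n) of a distribution P on {0, ..., n} from the binomial family B_n = {Bin(n, q) : 0 < q < 1}. It assumes the standard fact that the reverse I-projection onto this one-parameter exponential family matches the mean; all function names and the choice of optimizer are illustrative.

```python
# Hedged sketch: numerically estimate max_P D(P || B_n) for the binomial
# family B_n on {0, ..., n}.  The rI-projection of P onto B_n matches the
# mean, so q* = E_P[X]/n and D(P || B_n) = D(P || Bin(n, q*)).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

n = 4                       # states {0, ..., n}
xs = np.arange(n + 1)

def divergence_from_family(p):
    """KL divergence D(p || Bin(n, q*)) with q* the moment-matching parameter."""
    q = np.clip(p @ xs / n, 1e-12, 1 - 1e-12)   # mean-matching projection
    b = binom.pmf(xs, n, q)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / b[mask])))

def neg_divergence(z):
    """Objective on the simplex via a softmax parameterisation."""
    p = np.exp(z - z.max())
    p /= p.sum()
    return -divergence_from_family(p)

# Several random restarts of a derivative-free optimiser (illustrative choice).
best = max(
    (minimize(neg_divergence, np.random.randn(n + 1), method="Nelder-Mead")
     for _ in range(20)),
    key=lambda r: -r.fun,
)
p_opt = np.exp(best.x - best.x.max())
p_opt /= p_opt.sum()
print("approximate maximiser:", np.round(p_opt, 3))
print("approximate maximal divergence:", -best.fun)
```

Such a search only approximates the maximizers; the point of the paper summarized above is that, for the multinomial family, they can be described explicitly.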