We investigate the sets of joint probability distributions that maximize the average multi-information over a collection of margins. These functionals serve as proxies for maximizing the multi-information of a set of variables, or the mutual information of two subsets of variables, at lower computational and estimation complexity. We describe the maximizers and their relations to the maximizers of the multi-information and the mutual information.
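For discrete distributions both functionals are directly computable: the multi-information is Σ_i H(X_i) − H(X_1, …, X_n), and the average multi-information evaluates this on the marginal for each index set in the collection and averages the results. A minimal sketch in Python, assuming the joint pmf is stored as an n-dimensional array (the function names are ours, not the paper's):

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy (nats) of a probability array; zero cells contribute 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def multi_information(joint):
    """I(X_1;...;X_n) = sum_i H(X_i) - H(X_1,...,X_n) for a joint pmf."""
    n = joint.ndim
    marginals = sum(
        entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n)
    )
    return marginals - entropy(joint)

def average_multi_information(joint, margins):
    """Mean multi-information of the marginals of `joint` on each
    index set in `margins` (a list of tuples of axes)."""
    n = joint.ndim
    vals = [
        multi_information(joint.sum(axis=tuple(j for j in range(n) if j not in S)))
        for S in margins
    ]
    return float(np.mean(vals))

# Example: three binary variables, averaged over all pairwise margins.
joint = np.random.dirichlet(np.ones(8)).reshape(2, 2, 2)
pairs = list(itertools.combinations(range(3), 2))
print(multi_information(joint), average_multi_information(joint, pairs))
```

With pairwise margins the average only ever touches two-dimensional tables, which is where the lower computational and estimation complexity comes from.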
The maximal information coefficient (MIC) is a tool for finding the strongest pairwise relationships...
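For concreteness: MIC bins the two samples into an n_x × n_y grid, computes the mutual information of the binned pair normalized by log2 min(n_x, n_y), and maximizes over all grids whose cell count n_x · n_y stays below a budget such as N^0.6. A simplified sketch that scores only equal-frequency grids, rather than running the full grid optimization of Reshef et al., and therefore returns a lower bound on MIC:

```python
import numpy as np

def discretize(x, bins):
    """Equal-frequency binning into `bins` cells; returns labels 0..bins-1."""
    cuts = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    return np.searchsorted(cuts, x)

def grid_mi(x, y, nx, ny):
    """Plug-in mutual information (bits) of the equal-frequency binned pair."""
    counts, _, _ = np.histogram2d(
        discretize(x, nx), discretize(y, ny),
        bins=(nx, ny), range=[[-0.5, nx - 0.5], [-0.5, ny - 0.5]])
    p = counts / counts.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / (px @ py)[mask]))

def mic_lower_bound(x, y, alpha=0.6):
    """Max over equal-frequency grids with nx * ny <= len(x)**alpha of the
    normalized grid MI; the true MIC maximizes over *all* such grids."""
    budget = int(len(x) ** alpha)
    best = 0.0
    for nx in range(2, budget // 2 + 1):
        for ny in range(2, budget // nx + 1):
            best = max(best, grid_mi(x, y, nx, ny) / np.log2(min(nx, ny)))
    return best

rng = np.random.default_rng(0)
x = rng.random(500)
print(mic_lower_bound(x, x + 0.1 * rng.standard_normal(500)))  # strong relation
print(mic_lower_bound(x, rng.random(500)))                     # near independence
```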
Reshef et al. recently proposed a new statistical measure, the “maximal information coefficient” (M...
Recently, we introduced a simple variational bound on mutual information that resolves some of the ...
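The bound itself is truncated above; as a generic illustration of how variational bounds on mutual information work, the sketch below uses the classical Barber-Agakov lower bound I(X;Y) ≥ H(X) + E_{p(x,y)}[log q(x|y)], which holds for any conditional q(x|y) and is tight when q equals the true posterior (this is not necessarily the bound the quoted paper introduces):

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(12)).reshape(3, 4)   # joint pmf p(x, y)
px, py = p.sum(1), p.sum(0)
true_mi = np.sum(p * np.log(p / np.outer(px, py)))

def ba_bound(q_x_given_y):
    """Barber-Agakov bound: H(X) + E_{p(x,y)}[log q(x|y)] <= I(X;Y)."""
    h_x = -np.sum(px * np.log(px))
    return h_x + np.sum(p * np.log(q_x_given_y))

posterior = p / py                       # true p(x|y): bound is tight
blurred = 0.5 * posterior + 0.5 / 3      # miscalibrated q: bound is loose
print(true_mi, ba_bound(posterior), ba_bound(blurred))
```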
Information maximization is a common framework for unsupervised learning, which may be used for extr...
In this paper we consider mutual information for a pair of random variables and find a thi...
Stochastic interdependence of a probability distribution on a product space is measured by i...
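The measure alluded to has a standard closed form: the KL distance of p from the family of product distributions is attained at the product of p's own margins and equals the multi-information. In LaTeX:

```latex
% KL distance from the independence (product) family: the minimizer is the
% product of the margins p_1, ..., p_n of p, giving the multi-information.
\[
  I(p) \;=\; \min_{q_1,\dots,q_n} D\bigl(p \,\|\, q_1 \otimes \cdots \otimes q_n\bigr)
       \;=\; D\bigl(p \,\|\, p_1 \otimes \cdots \otimes p_n\bigr)
       \;=\; \sum_{i=1}^{n} H(p_i) - H(p).
\]
```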
Mutual information (MI) has been successfully applied to a wide variety of domains due to ...
Explicit solution of the problem of maximization of information divergence from the family o...
The subject of this thesis is the maximization of the information divergence from an exponential fam...
The mutual information of two random variables i and j with joint probabilities {π_ij} is commonly us...
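In that notation the quantity is I = Σ_ij π_ij log(π_ij / (π_i· π_·j)), where π_i· and π_·j are the row and column sums of the table; a direct implementation:

```python
import numpy as np

def mutual_information(pi):
    """I = sum_ij pi_ij * log(pi_ij / (pi_i. * pi_.j)) in nats, for a
    joint probability table `pi` indexed by rows i and columns j."""
    pi = np.asarray(pi, dtype=float)
    row = pi.sum(axis=1, keepdims=True)   # pi_i.
    col = pi.sum(axis=0, keepdims=True)   # pi_.j
    mask = pi > 0
    return float(np.sum(pi[mask] * np.log(pi[mask] / (row @ col)[mask])))

print(mutual_information(np.outer([0.3, 0.7], [0.5, 0.5])))  # independent: 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))          # coupled: log 2
```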
One of the main notions of information theory is the notion of mutual information in two m...
The present paper aims to propose a new type of information-theoretic method to maximize mutual inf...
This paper deals with the control of bias estimation when estimating mutual in...
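The paper's own bias-control technique is cut off above; as a baseline for what such control addresses, the plug-in estimator of mutual information from counted data is biased upward, and a classical first-order fix is the Miller-Madow correction, which adds (m − 1)/(2N) to each entropy estimate (m occupied bins, N samples). A sketch under that assumption, not the method of the quoted paper:

```python
import numpy as np

def entropy_mm(counts):
    """Miller-Madow corrected entropy (nats): plug-in + (m - 1) / (2N)."""
    counts = np.asarray(counts, dtype=float).ravel()
    counts = counts[counts > 0]
    n = counts.sum()
    p = counts / n
    return -np.sum(p * np.log(p)) + (len(counts) - 1) / (2 * n)

def mi_miller_madow(x, y):
    """MI estimate from two discrete label arrays, with each entropy
    term bias-corrected; offsets the plug-in estimator's upward bias."""
    _, joint = np.unique(np.stack([x, y]), axis=1, return_counts=True)
    _, cx = np.unique(x, return_counts=True)
    _, cy = np.unique(y, return_counts=True)
    return entropy_mm(cx) + entropy_mm(cy) - entropy_mm(joint)

# Independent labels: the corrected estimate sits near the true value 0,
# where the uncorrected plug-in estimate would be visibly positive.
rng = np.random.default_rng(1)
print(mi_miller_madow(rng.integers(0, 8, 200), rng.integers(0, 8, 200)))
```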
There are various definitions of mutual information. Essentially, these definitions can be divided i...