We propose a notion of common information that allows one to quantify and separate the information that is shared between two random variables from the information that is unique to each. Our notion of common information is a variational relaxation of the Gács-Körner common information, which we recover as a special case, but it is more amenable to optimization and can be approximated empirically using samples from the underlying distribution. We then provide a method to partition and quantify the common and unique information using a simple modification of a traditional variational auto-encoder. Empirically, we demonstrate that our formulation allows us to learn semantically meaningful common and unique factors of variation even on high-d...
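For reference, the classical Gács-Körner common information of a pair (X, Y) is the largest amount of information that can be extracted deterministically from either variable alone; a standard statement (given here as background only, not as the paper's variational relaxation) is

\[
  C_{GK}(X;Y) \;=\; \sup_{f,\,g \,:\, f(X)=g(Y)\ \text{a.s.}} H\big(f(X)\big),
\]

i.e. the supremum of the entropy of a common random variable Z = f(X) = g(Y) that is a deterministic function of X and, simultaneously, of Y.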
We propose new measures of shared information, unique information and synergistic information that c...
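As background for such decompositions, the bivariate partial information decomposition of Williams and Beer splits the information two sources X_1, X_2 carry about a target Y into shared, unique, and synergistic parts; the defining identities (a reference decomposition only, not the specific measures proposed in the work above) are

\[
  I(X_1, X_2; Y) = SI(Y; X_1, X_2) + UI(Y; X_1 \setminus X_2) + UI(Y; X_2 \setminus X_1) + CI(Y; X_1, X_2),
  \qquad
  I(X_i; Y) = SI(Y; X_1, X_2) + UI(Y; X_i \setminus X_{j \neq i}).
\]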
This paper was accepted for publication in Machine Learning (Springer). Overfitting data is a well-k...
In the field of machine learning, it is still a critical issue to identify and supervise the learned ...
This thesis introduces the Mutual Information Machine (MIM), an autoencoder model for learning j...
Wyner’s common information was originally defined for a pair of dependent discrete random variables....
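For reference, Wyner's common information of a pair (X, Y) is standardly defined as the least rate of an auxiliary variable that renders X and Y conditionally independent:

\[
  C_W(X;Y) \;=\; \inf_{W \,:\, X \perp Y \,\mid\, W} I(X, Y; W).
\]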
While the Hirschfeld-Gebelein-Rényi (HGR) maximal correlation and the Wyner common information share...
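For comparison, the HGR maximal correlation mentioned above is defined as

\[
  \rho_m(X;Y) \;=\; \sup_{f,\,g}\; \mathbb{E}\big[f(X)\,g(Y)\big],
  \qquad
  \mathbb{E}[f(X)] = \mathbb{E}[g(Y)] = 0,\quad \mathbb{E}[f(X)^2] = \mathbb{E}[g(Y)^2] = 1,
\]

the supremum ranging over all measurable real-valued functions satisfying the stated normalization.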
Mutual information (MI) has been successfully applied to a wide variety of domains due to ...
A new bimodal generative model is proposed for generating conditional and joint samples, accompanied...
We investigate the problem of learning representations that are invariant to certain nuisance or sen...
Recently, we introduced a simple variational bound on mutual information, that resolves some of the ...
In the past few years, generative models have become an interesting topic in the field of Machine Lea...
This paper generalizes Wyner’s definition of common information of a pair of random variables to tha...
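The natural N-variable form of this generalization (stated here only as the standard extension of Wyner's definition, which may differ in detail from the paper's) is

\[
  C_W(X_1, \dots, X_N) \;=\; \inf_{W \,:\, X_1, \dots, X_N \ \text{conditionally independent given}\ W} I(X_1, \dots, X_N; W).
\]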
Information maximization is a common framework of unsupervised learning, which may be used for extr...
One of the main notions of information theory is the notion of mutual information in two m...
We would like to learn a representation of the data that reflects the semantics behind a specific gr...