We examine a class of stochastic deep learning models with a tractable method to compute information-theoretic quantities. Our contributions are three-fold: (i) We show how entropies and mutual informations can be derived from heuristic statistical physics methods, under the assumption that weight matrices are independent and orthogonally invariant. (ii) We extend particular cases in which this result is known to be rigorously exact by providing a proof for two-layer networks with Gaussian random weights, using the recently introduced adaptive interpolation method. (iii) We propose an experiment framework with generative models of synthetic datasets, on which we train deep neural networks with a weight constraint designed so that the assumption in (i) is verified during learning. We study the behavior of entropies and mutual informations throughout learning and conclude that, in the proposed setting, the relationship between compression and generalization remains elusive.
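To make the model class in (i) concrete, here is a minimal sketch (an illustration under stated assumptions, not the authors' code): a stochastic network t_l = phi(W_l t_{l-1} / sqrt(n)) + sigma * xi_l with additive Gaussian noise, where the weight matrices are drawn exactly orthogonal and are therefore orthogonally invariant; the activation phi, noise level sigma, and layer sizes are placeholder choices.

    # Minimal sketch of the stochastic network class: t_l = phi(W_l t_{l-1}/sqrt(n)) + sigma*xi_l.
    # phi, sigma, and the layer sizes are illustrative assumptions, not the paper's settings.
    import numpy as np
    from scipy.stats import ortho_group

    rng = np.random.default_rng(0)

    def forward(x, weights, sigma=0.1, phi=np.tanh):
        """Propagate x through stochastic layers with additive Gaussian noise."""
        t = x
        for W in weights:
            pre = W @ t / np.sqrt(t.shape[0])   # rescaled pre-activation
            t = phi(pre) + sigma * rng.standard_normal(W.shape[0])
        return t

    n = 128
    # Exactly orthogonal weights: a simple member of the orthogonally invariant family.
    weights = [ortho_group.rvs(n, random_state=k) for k in range(2)]
    x = rng.standard_normal(n)
    t2 = forward(x, weights)                    # second-layer representation T_2

With Gaussian instead of orthogonal weight matrices, the two-layer instance of this sketch corresponds to the setting for which point (ii) states a rigorous proof.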
With the rise of big data, people have access to massive amounts of multi-view data. Measuring, ...
Deep Learning architectures give brilliant results in a large variety of fields, but a comprehensive...
There is a need to better understand how generalization works in a deep learning model. The goal of ...
Mutual Information (MI) has been widely used as a loss regularizer for training neural networks. Thi...
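Since this abstract is cut off above, the following is only a generic illustration of how an MI term can enter a training loss, not this paper's method: a MINE-style Donsker-Varadhan lower bound on I(X;Z) (Belghazi et al., 2018), estimated by a small critic network; the critic architecture and the batch-permutation trick for marginal samples are standard choices, assumed here for concreteness.

    # Generic MINE-style Donsker-Varadhan lower bound on I(X;Z), usable as a regularizer.
    # Not taken from the paper above; all names and sizes are illustrative.
    import torch
    import torch.nn as nn

    class MINECritic(nn.Module):
        """Statistics network T(x, z) for the Donsker-Varadhan bound."""
        def __init__(self, dx, dz, hidden=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dx + dz, hidden), nn.ReLU(), nn.Linear(hidden, 1))

        def forward(self, x, z):
            return self.net(torch.cat([x, z], dim=-1))

    def mi_lower_bound(critic, x, z):
        """E_joint[T] - log E_marginals[exp T], a lower bound on I(X;Z)."""
        joint = critic(x, z).mean()
        z_shuf = z[torch.randperm(z.shape[0])]  # shuffle pairing -> product of marginals
        log_n = torch.log(torch.tensor(float(z.shape[0])))
        marginal = torch.logsumexp(critic(x, z_shuf), dim=0) - log_n
        return joint - marginal.squeeze()

In use, the bound is maximized over the critic's parameters to tighten the estimate, and the resulting value is added to (to penalize MI) or subtracted from (to encourage MI) the task loss with some weight lambda.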
Entropy and conditional mutual information are the key quantities information theory provides to mea...
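For reference, the standard definitions behind that sentence, in the discrete case:

    H(X) = -\sum_x p(x) \log p(x)
    I(X; Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z)
    I(X; Y, Z) = I(X; Z) + I(X; Y \mid Z)   % chain rule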
This chapter discusses the role of information theory for analysis of neural networks using differen...
The practical successes of deep neural networks have not been matched by theoretical progress that s...
We show that information-theoretic quantities can be used to control and describe the training proce...
We study the estimation of the mutual information I(X; T_ℓ) between the input X ...
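Assuming the noisy-network setting this line of work typically studies (the layer output is a deterministic map of the input plus independent Gaussian noise; the map f_ℓ and variance β² below are notational assumptions, not quoted from the truncated abstract), the quantity decomposes into differential entropies:

    I(X; T_\ell) = h(T_\ell) - h(T_\ell \mid X) = h(f_\ell(X) + Z) - h(Z),
    \qquad Z \sim \mathcal{N}(0, \beta^2 I)

so only h(f_ℓ(X) + Z) has to be estimated from samples; for a deterministic network with continuous inputs, I(X; T_ℓ) would be infinite, which is why the injected noise is essential.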