This work explores the sensitivity of mutual information (MI) flow in the hidden layers of very deep neural networks (DNNs) as a function of the initialization variance. Specifically, we demonstrate that information-bottleneck (IB) interpretations of DNNs are significantly affected by the choice of nonlinearity as well as by the weight and bias variances. Initialization on the network's mean-field (MF) edge of chaos (EOC) results in maximal information propagation through the layers of even very deep DNNs; consequently, their IB plots are effectively single points that do not vary, and high accuracy is rapidly obtained with training. By contrast, initialization away from the EOC results in loss of MI through depth and the more characteristic IB plots observed in the l...
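To make the EOC initialization concrete, the following is a minimal sketch of the standard mean-field recursion for a random tanh network (in the spirit of the MF literature; the function names, Monte Carlo sample counts, and the specific variance values are our own illustrative choices, not taken from the paper). The pre-activation variance evolves as q^{l+1} = σ_w² E[tanh(√q^l z)²] + σ_b² with z ~ N(0,1), and the EOC is the set of (σ_w², σ_b²) where the perturbation multiplier χ = σ_w² E[tanh′(√q* z)²] equals 1 at the fixed point q*.

```python
import numpy as np

def q_map(q, sigma_w2, sigma_b2, n_mc=200_000, rng=None):
    """One step of the mean-field variance recursion for a tanh network."""
    rng = rng or np.random.default_rng(0)
    z = rng.standard_normal(n_mc)
    return sigma_w2 * np.mean(np.tanh(np.sqrt(q) * z) ** 2) + sigma_b2

def chi(q, sigma_w2, n_mc=200_000, rng=None):
    """Perturbation multiplier chi = sigma_w^2 * E[tanh'(sqrt(q) z)^2]."""
    rng = rng or np.random.default_rng(1)
    z = rng.standard_normal(n_mc)
    sech2 = 1.0 / np.cosh(np.sqrt(q) * z) ** 2   # tanh'(x) = sech^2(x)
    return sigma_w2 * np.mean(sech2 ** 2)

def fixed_point(sigma_w2, sigma_b2, iters=100):
    """Iterate the variance map to its fixed point q*."""
    q = 1.0
    for _ in range(iters):
        q = q_map(q, sigma_w2, sigma_b2)
    return q

# With sigma_b^2 = 0.05: chi < 1 is the ordered phase, chi > 1 the chaotic
# phase; sigma_w^2 near 1.76 sits close to the tanh EOC reported in the
# mean-field literature.
for sw2 in (1.0, 1.76, 3.0):
    q_star = fixed_point(sw2, 0.05)
    print(f"sigma_w^2={sw2:.2f}  q*={q_star:.3f}  chi={chi(q_star, sw2):.3f}")
```

Sweeping σ_w² at fixed σ_b² and locating χ = 1 traces out the EOC curve on which the initializations above are placed.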
Deep Learning (DL) networks are recent revolutionary developments in artificial intelligence researc...
Copyright © 2019 ASME We study the estimation of the mutual information I(X;Tℓ) between the input X ...
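Estimates of I(X;Tℓ) in information-plane studies are often computed with a plug-in (binned) estimator. The sketch below is an illustrative version of that generic technique, not the estimator of any particular paper: the helper names, bin count, and toy data are our own assumptions.

```python
import numpy as np
from collections import Counter

def entropy_bits(symbols):
    """Plug-in Shannon entropy (in bits) of a sequence of hashable symbols."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def mi_binned(x_labels, t, n_bins=30):
    """Estimate I(X; T) by discretizing each unit of T into equal-width bins."""
    t = np.atleast_2d(np.asarray(t, dtype=float))
    if t.shape[0] != len(x_labels):
        t = t.T                                   # accept (units, samples) too
    edges = np.linspace(t.min(), t.max(), n_bins + 1)
    t_disc = np.digitize(t, edges[1:-1])          # per-unit bin indices
    t_syms = [tuple(row) for row in t_disc]       # one discrete symbol/sample
    xt_syms = list(zip(x_labels, t_syms))
    # I(X;T) = H(X) + H(T) - H(X,T) with plug-in entropies.
    return (entropy_bits(list(x_labels)) + entropy_bits(t_syms)
            - entropy_bits(xt_syms))

# Sanity check: if T is a deterministic, injective function of X, the
# estimate recovers H(X) (1 bit for balanced binary labels).
x = np.repeat([0, 1], 500)
t = x[:, None].astype(float)          # toy "layer output" = the label itself
print(f"I(X;T) = {mi_binned(x, t):.3f} bits")   # prints I(X;T) = 1.000 bits
```

Plug-in estimators of this kind are biased upward for continuous representations, which is one reason the estimation of I(X;Tℓ) is itself a subject of study.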
Autonomous, randomly coupled, neural networks display a transition to chaos at a critical coupling s...
The practical successes of deep neural networks have not been matched by theoretical progress that s...
Although deep neural networks (DNNs) have made remarkable achievements in various fields, there is stil...
While rate distortion theory compresses data under a distortion constraint, information bottleneck (...
This paper presents a method to explain how the information of each input variable is gradually disc...
The Information Bottleneck theory provides a theoretical and computational framework for finding app...
Recurrent networks of randomly coupled rate neurons display a transition to chaos at a critical coup...
Deep feedforward networks initialized along the edge of chaos exhibit exponentially superior trainin...
We investigate information processing in randomly connected recurrent neural networks. It has been s...
In solving challenging pattern recognition problems, deep neural networks have shown excellent perfo...
We show that the input correlation matrix of typical classification datasets has an eigenspectrum wh...
We consider the problem of identifying the most influential nodes for a spreading process on a netwo...