Deep neural networks progressively transform their inputs across multiple processing layers. What are the geometrical properties of the representations learned by these networks? Here we study the intrinsic dimensionality (ID) of data-representations, i.e. the minimal number of parameters needed to describe a representation. We find that, in a trained network, the ID is orders of magnitude smaller than the number of units in each layer. Across layers, the ID first increases and then progressively decreases in the final layers. Remarkably, the ID of the last hidden layer predicts classification accuracy on the test set. These results can neither be found by linear dimensionality estimates (e.g., with principal component analysis), nor in rep...
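The abstract's contrast between intrinsic dimensionality and linear (PCA) estimates can be illustrated with a small sketch. Note the estimator, data, and numbers here are illustrative assumptions, not taken from the abstract: we use the TwoNN estimator (Facco et al., 2017), one common nonlinear ID estimator, on a synthetic swiss-roll surface — a 2-D manifold embedded nonlinearly in 3-D, where PCA sees three sizeable variance directions but the intrinsic dimension is 2.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 2-D "swiss roll" surface embedded nonlinearly in 3-D ambient space.
n = 1000
u = rng.uniform(1.5 * np.pi, 4.5 * np.pi, n)
v = rng.uniform(0.0, 21.0, n)
X = np.column_stack([u * np.cos(u), v, u * np.sin(u)])

# Linear view: PCA (covariance eigenvalues) finds three comparable
# variance directions, i.e. it reports the ambient rank, not the ID.
C = np.cov(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(C))[::-1]

# TwoNN intrinsic-dimension estimator: for each point take the ratio
# mu = r2/r1 of its two nearest-neighbour distances; the maximum-
# likelihood estimate of the ID is n / sum(log mu).
sq = np.sum(X**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
D2 = np.maximum(D2, 0.0)        # guard against tiny negative round-off
np.fill_diagonal(D2, np.inf)    # exclude each point's distance to itself
D2.sort(axis=1)
r1, r2 = np.sqrt(D2[:, 0]), np.sqrt(D2[:, 1])
id_est = n / np.log(r2 / r1).sum()

print("PCA eigenvalues:", np.round(eigvals, 1))   # three comparable values
print(f"TwoNN intrinsic dimension: {id_est:.2f}") # typically close to 2
```

The point of the sketch: the nonlinear estimator recovers the 2-D manifold structure that a linear spectrum-based estimate misses, which is the distinction the abstract draws between ID and PCA-based dimensionality.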
We show that deep ReLU neural network classifiers can see a low-dimensional Riemannian manifold stru...
One of the important challenges today in deep learning is explaining the outstanding power of genera...
The ongoing exponential rise in recording capacity calls for new approaches for analysing and interp...
Appears at NeurIPS 2021. Disobeying the classical wisdom of statistical learning...
Across scientific and engineering disciplines, the algorithmic pipeline for processing and understand...
Deep learning algorithms are responsible for a technological revolution in a variety of tasks includi...
The information explosion of the past few decades has created tremendous opportunities for Machine L...
Understanding the reasons for the success of deep neural networks trained using stochastic gradient-...
Significant success of deep learning has brought unprecedented challenges to conventional wisdom in ...
A remarkable characteristic of overparameterized deep neural networks (DNNs) is that their accuracy ...
Biological learning systems are outstanding in their ability to learn from limited training data com...
Deep neural networks (DNNs) defy the classical bias-variance trade-off: adding parameters to a DNN t...