Deep neural networks achieve stellar generalisation on a variety of problems, despite often being large enough to easily fit all their training data. Here we study the generalisation dynamics of two-layer neural networks in a teacher-student setup, where one network, the student, is trained using stochastic gradient descent (SGD) on data generated by another network, called the teacher. We show how, for this problem, the dynamics of SGD are captured by a set of differential equations. In particular, we demonstrate analytically that the generalisation error of the student increases linearly with the network size, with all other relevant parameters held constant. Our results indicate that achieving good generalisation in neural networks depends on...
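The setup this abstract describes is straightforward to simulate directly. Below is a minimal NumPy sketch of one common variant, the soft committee machine (second-layer weights fixed to one), in which a fixed random teacher labels i.i.d. Gaussian inputs and the student learns from one fresh example per SGD step. The sizes, learning rate, and initialisation are illustrative assumptions, not the paper's exact settings.

```python
# Minimal teacher-student simulation with online SGD (soft committee machine).
# All hyperparameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 500    # input dimension (assumed)
M = 2      # teacher hidden units (assumed)
K = 5      # student hidden units; K > M over-parameterises the student
eta = 0.5  # SGD learning rate (assumed)

g = np.tanh                            # hidden-unit activation
g_prime = lambda z: 1 - np.tanh(z)**2  # its derivative

W_teacher = rng.normal(size=(M, N))  # fixed random teacher
W_student = rng.normal(size=(K, N))  # random student initialisation

def outputs(W, X):
    # Soft committee machine: sum of hidden activations,
    # with local fields scaled by 1/sqrt(N) to stay O(1).
    return g(X @ W.T / np.sqrt(N)).sum(axis=1)

def gen_error(n_test=5_000):
    # Monte Carlo estimate of the population risk E[(student - teacher)^2] / 2.
    X = rng.normal(size=(n_test, N))
    return 0.5 * np.mean((outputs(W_student, X) - outputs(W_teacher, X)) ** 2)

# Online SGD: every example is drawn fresh from N(0, I), used once, discarded.
for step in range(100 * N):  # 100*N steps = 100 units of rescaled time alpha = step/N
    x = rng.normal(size=N)
    fields = W_student @ x / np.sqrt(N)
    delta = outputs(W_student, x[None, :])[0] - outputs(W_teacher, x[None, :])[0]
    # Gradient of the per-sample loss 0.5 * delta^2 w.r.t. each student weight vector.
    W_student -= eta * np.outer(delta * g_prime(fields), x) / np.sqrt(N)
    if step % (10 * N) == 0:
        print(f"alpha = {step / N:6.1f}   eg = {gen_error():.5f}")
```

Tracking the error against the rescaled time alpha = step/N is what lets the stochastic trajectory be compared with the deterministic differential-equation description mentioned in the abstract; in the limit of large input dimension N, the simulated curve concentrates around the ODE solution.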
We study on-line gradient-descent learning in multilayer networks analytically and numerically. The ...
In this work, we construct generalization bounds to understand existing learning algorithms and prop...
Machine learning, and in particular neural network models, have revolutionized fields such as image,...
The generalization mystery in deep learning is the following: Why do over-parameterized neural netwo...
Empirical studies show that gradient-based methods can learn deep neural networks (DNNs) with very g...
We study the effect of regularization in an on-line gradient-descent learning scenario for a general...
In this thesis, we study model parameterization for deep learning applications. Part of the mathemat...