This is the final version. Available from ICLR via the link in this record.

Deep neural networks (DNNs) generalize remarkably well without explicit regularization, even in the strongly over-parametrized regime where classical learning theory would predict severe overfitting. While many proposals for some kind of implicit regularization have been made to rationalise this success, there is no consensus on the fundamental reason why DNNs do not strongly overfit. In this paper, we provide a new explanation. By applying a very general probability-complexity bound recently derived from algorithmic information theory (AIT), we argue that the parameter-function map of many DNNs should be exponentially biased towards simple f...
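The bias this abstract describes can be probed empirically with a minimal sketch: sample random parameters for a small network over all boolean inputs and tally how often each induced function appears. The network size, input dimension, and Gaussian sampling distribution below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n = 5  # input bits; the network maps {0,1}^n -> {0,1}

# All 2^n boolean inputs as rows of a matrix.
X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)], float)

def random_function(width=16):
    """Sample parameters i.i.d. N(0,1); return the induced boolean function
    encoded as a 2^n-character output string."""
    W1 = rng.normal(size=(n, width))
    b1 = rng.normal(size=width)
    W2 = rng.normal(size=width)
    b2 = rng.normal()
    h = np.maximum(X @ W1 + b1, 0.0)      # ReLU hidden layer
    out = (h @ W2 + b2 > 0).astype(int)   # threshold output unit
    return "".join(map(str, out))

# A uniform map over the 2^(2^n) possible functions would make repeats
# vanishingly rare in 20,000 samples; a biased map produces many repeats,
# with simple functions (e.g. constants) appearing far more often.
counts = Counter(random_function() for _ in range(20_000))
print(counts.most_common(5))
```

The tally typically concentrates on a small set of low-complexity functions, which is the qualitative signature of the exponential simplicity bias the bound predicts.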
Deep neural networks (DNNs) defy the classical bias-variance trade-off: adding parameters to a DNN t...
Modern deep neural networks (DNNs) represent a formidable challenge for theorists: according to the ...
Modern deep neural networks are highly over-parameterized compared to the data on which they are tra...
While there has been progress in developing non-vacuous generalization bounds for deep neural networ...
This paper provides theoretical insights into why and how deep learning can generalize well, despite...
The ability of deep neural networks to generalise well even when they interpolate their training dat...
With a direct analysis of neural networks, this paper presents a mathematically tight generalization...
Deep learning has transformed computer vision, natural language processing, and speech recognition. ...
Gradient-based deep-learning algorithms exhibit remarkable performance in practice, but it is not we...
Understanding how feature learning affects generalization is among the foremost goals of modern deep...
Over-parameterized deep neural networks (DNNs) with sufficient capacity to memorize random noise can...
In recent years, Deep Neural Networks (DNNs) have achieved state-of-the-art results in many fields su...
The understanding of generalization in machine learning is in a state of flux. This is partly due to...