We describe computational tasks - especially in vision - that correspond to compositional/hierarchical functions. While the universal approximation property holds for both hierarchical and shallow networks, we prove that deep (hierarchical) networks can approximate the class of compositional functions with the same accuracy as shallow networks but with exponentially smaller VC-dimension and exponentially fewer trainable parameters. This leads to the question of approximation by polynomials that are sparse in the number of independent parameters and, as a consequence, by deep networks. We also discuss connections between our results and the learnability of sparse Boolean functions, settling an old conjecture by Bengio. This work was supported by the Cente...
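As a minimal sketch of the compositional setting the abstract above refers to (the function and constituent `h` here are hypothetical illustrations, not taken from the paper): a function of d = 8 variables built as a binary tree of two-variable constituents. Each node depends on only two inputs, so a deep network matching the tree composes d - 1 low-dimensional pieces, whereas a shallow network must approximate all d variables at once.

```python
def h(a, b):
    # Hypothetical smooth two-variable constituent function.
    return 0.5 * (a + b)

def compositional(xs):
    """Reduce the inputs pairwise through h until one value remains,
    mirroring a binary-tree compositional function
    f(x1,...,x8) = h(h(h(x1,x2), h(x3,x4)), h(h(x5,x6), h(x7,x8)))."""
    level = list(xs)
    while len(level) > 1:
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# With h chosen as the pairwise mean, the tree computes the overall mean.
print(compositional([1, 2, 3, 4, 5, 6, 7, 8]))  # -> 4.5
```

A tree with 8 leaves has only 7 internal nodes, each a function of 2 variables; this is the structural sparsity that the approximation bounds in the abstract exploit.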
We contribute to a better understanding of the class of functions that can be represented by a neura...
Recently, researchers in the artificial neural network field have focused their attention on connect...
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Com...
Deep learning networks with convolution, pooling and subsampling are a special case of hierarchica...
While the universal approximation property holds both for hierarchical and shallow networks, deep ne...
The paper reviews and extends an emerging body of theoretical results on deep learning including the...
The paper briefly reviews several recent results on hierarchical architectures for learning from exa...
[formerly titled "Why and When Can Deep – but Not Shallow – Networks Avoid the Curse of Dimensionali...
Recently there has been much interest in understanding why deep neural networks are preferred to sha...
© 2020 American Institute of Mathematical Sciences. All rights reserved. We show that deep networks ...
The paper characterizes classes of functions for which deep learning can be exponentially better tha...
We investigate the representational power of sum-product networks (computation networks analogous to...
How does a 110-layer ResNet learn a high-complexity classifier using relatively few training example...
The main success stories of deep learning, starting with ImageNet, depend on convolutional networks,...
© 2017, Springer Science+Business Media, LLC. We show how the success of deep learning could depend ...