In this paper the authors discuss several complexity aspects of neural networks, commonly grouped under the curse of dimensionality. The focus is on: (1) size complexity and depth-size tradeoffs; (2) complexity of learning; and (3) precision and limited interconnectivity. Results have been obtained for each of these problems when treated separately, but little is known about the links among them. The authors begin by presenting known results and then try to establish connections between them. These show that very difficult problems arise, with exponential growth in either space (i.e., precision and size) and/or time (i.e., learning and depth), when resorting to neural networks for solving general problems. The paper will present ...
The acclaimed successes of neural networks often overshadow their tremendous complexity. We focus on...
Each year, deep learning demonstrates new and improved empirical results with deeper and wider neura...
The term 'curse of dimensionality' is used to describe either the problems associated with the ...
The paper reviews and extends an emerging body of theoretical results on deep learning including the...
[formerly titled "Why and When Can Deep – but Not Shallow – Networks Avoid the Curse of Dimensionali...
We study the sample complexity of learning neural networks by providing new bounds on their Rademach...
One critical aspect neural network designers face today is choosing an appropriate network size for ...
A remarkable characteristic of overparameterized deep neural networks (DNNs) is that their accuracy ...
Recently, researchers in the artificial neural network field have focused their attention on connect...
Thesis (Ph.D.), University of Washington, 2020. Neural networks trained by machine learning optimizati...
People believe that depth plays an important role in the success of deep neural networks (DNNs). However,...
We solve an open question from Lu et al. (2017) by showing that any target network with inputs in $...
This is Chapter 2 of Part 1 of the book titled "Deep Learning": a nine-part easy-to-grasp textbook w...
The paper characterizes classes of functions for which deep learning can be exponentially better tha...
How does the size of a neural circuit influence its learning performance? Larger brains tend to be f...