Sets of multivariable functions that can be approximated with “dimension-independent” rates either by linear approximators or by neural networks having various types of computational units are compared. The comparison is made by exhibiting families of functions belonging to suitable difference set…
In this article, we present univariate and multivariate basic approximation by Kantorovich-Choquet t...
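For orientation only (a sketch; the operators in that article may be defined differently), the classical univariate Bernstein-Kantorovich operator on C[0,1] is
\[
K_n(f)(x) \;=\; \sum_{k=0}^{n} \binom{n}{k} x^k (1-x)^{n-k}\,(n+1)\!\int_{k/(n+1)}^{(k+1)/(n+1)} f(t)\,dt ,
\]
and Kantorovich-Choquet variants are typically obtained by replacing the Lebesgue integral with a normalized Choquet integral taken with respect to a monotone set function.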
We show that sums of separable functions achieve error bounds similar to those of single-layer neural networks...
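As a hedged illustration of the objects being compared (generic symbols, not necessarily the notation of that work), an n-term sum of separable functions of d variables has the form
\[
f(x_1,\dots,x_d) \;\approx\; \sum_{r=1}^{n} \prod_{j=1}^{d} g_{r,j}(x_j),
\]
which has the same n-term structure as a single-hidden-layer network with n units.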
Convolutional neural networks are the most widely used type of neural networks in applications. In m...
Neural networks provide a more flexible approximation of functions than traditional linear regressio...
Here we study the multivariate quantitative approximation of real valued continuous multivariate fun...
Here we study the multivariate quantitative approximation of real valued continuous multivariate fun...
A class of Sobolev-type multivariate functions is approximated by a feedforward network with one hidden...
Let q ⩾ 1 be an integer, Q be a Borel subset of the Euclidean space R^q, and μ be a probability mea...
Capabilities of linear and neural-network models are compared from the point of view of requirements...
In this article, we develop a framework for showing that neural networks can overcome the curse of d...
A feedforward neural net with d input neurons and with a single hidden layer of n neurons is given b...
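The truncated definition above presumably refers to the standard single-hidden-layer model; a minimal sketch of that form, with activation function \sigma, inner weights a_k \in \mathbb{R}^d, biases b_k, and outer coefficients c_k (all generic symbols), is
\[
N_n(x) \;=\; \sum_{k=1}^{n} c_k\,\sigma(a_k \cdot x + b_k), \qquad x \in \mathbb{R}^d .
\]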
We consider the approximation of smooth multivariate functions in C(R^d) by feedforward neural n...
A feedforward neural net with d input neurons and with a single hidden layer of n neurons is given b...
We consider the approximation of smooth multivariate functions in C(R^d) by feedforward neural netwo...