Abstract. The complexity of Gaussian radial-basis-function networks with varying widths is investigated. Upper bounds on the rates of decrease of approximation errors with an increasing number of hidden units are derived. The bounds are expressed in terms of norms measuring smoothness (Bessel and Sobolev norms) multiplied by explicitly given functions a(r, d) of the number of variables d and the degree of smoothness r. The estimates are proven using suitable integral representations in the form of networks with continua of hidden units computing scaled Gaussians and translated Bessel potentials. Consequences for the tractability of approximation by Gaussian radial-basis-function networks are discussed.
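The rate-of-decrease statement can be probed numerically. The following is a minimal sketch, assuming a one-dimensional smooth target, uniformly spaced centers, and a heuristic width schedule (none of which are prescribed by the abstract): it fits a Gaussian RBF network by least squares and prints the sup-norm error as the number n of hidden units grows.

```python
# Minimal sketch: least-squares fit of a univariate Gaussian RBF network
# (varying widths) to a smooth target, observing how the error decreases
# as the number n of hidden units grows. The target f, the grid, and the
# width schedule are illustrative choices, not taken from the paper.
import numpy as np

def gaussian_design(x, centers, widths):
    # Matrix G[i, k] = exp(-((x_i - c_k) / w_k)^2): one column per hidden unit.
    return np.exp(-((x[:, None] - centers[None, :]) / widths[None, :]) ** 2)

def fit_rbf(x, y, n):
    # Centers on a uniform grid; widths shrink with n so neighbouring
    # Gaussians still overlap (a common heuristic, not the paper's scheme).
    centers = np.linspace(x.min(), x.max(), n)
    widths = np.full(n, 2.0 * (x.max() - x.min()) / n)
    G = gaussian_design(x, centers, widths)
    coeffs, *_ = np.linalg.lstsq(G, y, rcond=None)
    return centers, widths, coeffs

x = np.linspace(-1.0, 1.0, 400)
f = np.sin(np.pi * x) * np.exp(-x ** 2)   # smooth target

for n in (4, 8, 16, 32):
    centers, widths, coeffs = fit_rbf(x, f, n)
    approx = gaussian_design(x, centers, widths) @ coeffs
    print(f"n = {n:2d} hidden units, sup-norm error = {np.abs(f - approx).max():.2e}")
```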
Feedforward networks are a class of approximation techniques that can be used to learn to perform so...
Abstract. Let s ≥ d ≥ 1 be integers, 1 ≤ p < ∞. We investigate the degree of approximation of 2π-perio...
Abstract. In neural network theory the complexity of constructing networks to approximate input-output...
Feedforward networks together with their training algorithms are a class of regression techniques th...
Introduction. Neural networks consisting of localized basis functions are used for approximation of ...
Networks can be considered as approximation schemes. Multilayer networks of the backpropagation type...
We consider the approximation of smooth multivariate functions in C(R^d) by feedforward neural n...
Abstract. We prove that neural networks with a single hidden layer are capable of providing an optim...
Abstract. Function approximation arises in many applications. The radial basis function (RB...
In this paper, we bound the generalization error of a class of Radial Basis Function networks, for...
We had previously shown that regularization principles lead to approximation schemes which are equiv...
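The equivalence referenced in that snippet, between regularization-based approximation schemes and one-hidden-layer RBF networks, can be illustrated with kernel ridge regression, whose representer-theorem minimizer is a Gaussian RBF network with one unit per data point. This is a minimal sketch assuming a Gaussian kernel, synthetic data, and illustrative values for the width sigma and the regularization weight lam; none of these choices come from the cited work.

```python
# Hedged sketch of a "regularization network": a kernel ridge fit whose
# minimizer is a one-hidden-layer RBF network with one Gaussian unit per
# data point. Kernel choice, sigma, and lam are illustrative assumptions.
import numpy as np

def fit_regularization_network(X, y, sigma=0.5, lam=1e-3):
    # Gram matrix K[i, j] = exp(-||X_i - X_j||^2 / sigma^2).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    K = np.exp(-d2 / sigma ** 2)
    # Coefficients of the representer-theorem solution: (K + lam * I) c = y.
    c = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return c, K

X = np.random.default_rng(1).uniform(-1, 1, size=(50, 1))
y = np.sin(np.pi * X[:, 0])
c, K = fit_regularization_network(X, y)
print("training RMSE:", np.sqrt(np.mean((K @ c - y) ** 2)))
```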
We analyze how radial basis functions are able to handle problems which are not linearly separable. ...
Let s ≥ 1 be an integer. A Gaussian network is a function on R^s of the form g(x) = ∑_{k=1}^{N} a_k exp(−‖x − ...
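The formula above is cut off mid-expression; the standard completion is g(x) = ∑_{k=1}^{N} a_k exp(−‖x − x_k‖² / σ_k²) with centers x_k ∈ R^s and per-unit widths σ_k. The sketch below evaluates that assumed form; everything past "‖x −" is inferred from the usual definition of a Gaussian network, not read from the excerpt.

```python
# Sketch of a Gaussian network on R^s, assuming the standard completion
# g(x) = sum_k a_k * exp(-||x - x_k||^2 / sigma_k^2) of the truncated formula.
import numpy as np

def gaussian_network(x, a, centers, sigmas):
    """Evaluate g at a single point x in R^s.

    a       : (N,)   output weights a_k
    centers : (N, s) translation vectors x_k
    sigmas  : (N,)   per-unit widths sigma_k
    """
    sq_dist = np.sum((centers - x) ** 2, axis=1)      # ||x - x_k||^2
    return float(a @ np.exp(-sq_dist / sigmas ** 2))

# Tiny usage example with N = 3 units in s = 2 dimensions.
rng = np.random.default_rng(0)
a = rng.normal(size=3)
centers = rng.normal(size=(3, 2))
sigmas = np.array([0.5, 1.0, 2.0])
print(gaussian_network(np.zeros(2), a, centers, sigmas))
```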