A feedforward neural net with d input neurons and a single hidden layer of n neurons is given by N(x) = sum_{j=1}^{n} a_j sigma( sum_{i=1}^{d} w_{ji} x_i + theta_j ), where a_j, theta_j, w_{ji} ∈ R and sigma is the activation function. In this paper we study the approximation of arbitrary functions f: R^d → R by a neural net in an L^p(mu) norm for some finite measure mu on R^d. We prove that, under natural moment conditions, a neural net with a non-polynomial activation function can approximate any given function. (C) 1998 Elsevier Science Ltd. All rights reserved.
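The single-hidden-layer form above can be sketched in a few lines of Python. This is an illustrative aside, not part of any abstract; the function name, the tanh activation, and all parameter values are arbitrary assumptions.

```python
import numpy as np

def single_hidden_layer_net(x, a, w, theta, sigma=np.tanh):
    """Evaluate N(x) = sum_j a_j * sigma(sum_i w_ji * x_i + theta_j).

    x: input vector of length d; a, theta: length-n coefficient vectors;
    w: n-by-d weight matrix; sigma: any (ideally non-polynomial) activation.
    """
    return float(a @ sigma(w @ x + theta))

# Example: a net with d = 2 inputs and n = 3 hidden neurons
# (all parameter values drawn at random, for illustration only).
rng = np.random.default_rng(0)
d, n = 2, 3
a = rng.standard_normal(n)
w = rng.standard_normal((n, d))
theta = rng.standard_normal(n)
y = single_hidden_layer_net(np.array([0.5, -1.0]), a, w, theta)
```

The universal-approximation results surveyed in the entries below concern how well N(x) of this form can match a target f as n grows; the sketch only shows how a single such net is evaluated.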
This paper examines the function approximation properties of the random neural-network model or GN...
We generalize the classical universal approximation theorem for neural networks to the case of compl...
Approximation properties of the MLP (multilayer feedforward perceptron) model of neural netw...
In this thesis we summarise several results in the literature which show the approximation capabilit...
In this paper, we present a review of some recent works on approximation by feedforward neural netwo...
We study the fundamental limits to the expressive power of neural networks. Gi...
We prove that neural networks with a single hidden layer are capable of providing an optim...
We consider the approximation of smooth multivariate functions in C(R^d) by feedforward neural netwo...