Given a multilayer perceptron (MLP), there are functions that can be approximated up to any degree of accuracy by the MLP without increasing the number of hidden nodes. Those functions belong to the closure F̄ of the set F of the maps realizable by the MLP. In this paper, we give a list of maps with this property. In particular, it is proven that rational functions belong to F̄ for networks with the arctangent activation function and that exponential functions belong to F̄ for networks with the sigmoid activation function. Moreover, for a restricted class of MLPs, we prove that the list is complete and give an analytic definition of F̄.
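The closure idea can be made concrete with a small numerical sketch. This example is not from the paper; it is a standard illustration, under the assumption that the network uses arctangent activations: the rational function 1/(1 + x²) is the derivative of arctan(x), so the two-hidden-node network (arctan(x + h) − arctan(x))/h approximates it uniformly as h → 0 while the number of hidden nodes stays fixed at two. The function is therefore a limit of realizable maps, i.e. it lies in F̄ without itself being exactly realizable.

```python
# Sketch (illustrative, not the paper's construction): a two-hidden-node
# MLP with arctangent activation approximates the rational function
# 1/(1 + x^2) arbitrarily well as h -> 0, with the node count fixed.
import numpy as np

def two_node_net(x, h):
    """MLP with two arctan hidden nodes, biases 0 and h, output weights ±1/h."""
    return (np.arctan(x + h) - np.arctan(x)) / h

def rational_target(x):
    """Target rational function 1/(1 + x^2) = d/dx arctan(x)."""
    return 1.0 / (1.0 + x**2)

x = np.linspace(-5.0, 5.0, 1001)
for h in (1e-1, 1e-2, 1e-3):
    err = np.max(np.abs(two_node_net(x, h) - rational_target(x)))
    print(f"h = {h:g}: sup error on [-5, 5] = {err:.2e}")
```

Since |d²/dx² arctan(x)| is bounded, the forward-difference error is O(h) uniformly, so the printed sup error shrinks linearly with h even though every network in the sequence has exactly two hidden nodes.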
This paper considers the approximation of sufficiently smooth multivariable functions with a multila...
Several researchers characterized the activation functions under which multilayer feedforward netwo...
We investigate the approximation ability of a multilayer perceptron (MLP) network when it is extende...
Approximation properties of the MLP (multilayer feedforward perceptron) model of neural netw...
We prove that neural networks with a single hidden layer are capable of providing an optim...
It is shown that the general approximation property of feed-forward multilayer perceptron ne...
Ellerbrock TM. Multilayer neural networks : learnability, network generation, and network simplifica...
Neural Networks are widely noticed to provide a nonlinear function approximation method. In order to...
This paper proposes a new method to reduce training time for neural nets used as function approximat...
We investigate the efficiency of approximation by linear combinations of ridge functions in the met...