We review the use of global and local methods for estimating a function mapping R^m → R^n from samples of the function containing noise. The relationship between the methods is examined and an empirical comparison is performed using the multi-layer perceptron (MLP) global neural network model, the single nearest-neighbour model, a linear local approximation (LA) model, and the following commonly used datasets: the Mackey-Glass chaotic time series, the Sunspot time series, British English Vowel data, TIMIT speech phonemes, building energy prediction data, and the sonar dataset. We find that the simple local approximation models often outperform the MLP. No criterion such as classification/prediction, size of the training set, dimensionality of the ...
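To illustrate the local side of this comparison, below is a minimal sketch of a k-nearest-neighbour linear local approximation, the kind of model the abstract contrasts with the global MLP. The helper name `local_linear_predict` and the choice k=5 are our own assumptions for the example, not taken from the paper.

```python
# Illustrative sketch only: linear local approximation via k nearest neighbours.
import numpy as np

def local_linear_predict(X_train, y_train, x_query, k=5):
    """Fit a least-squares linear model on the k nearest neighbours of x_query."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    idx = np.argsort(dists)[:k]                 # indices of the k nearest samples
    # Augment with a bias column and solve the local least-squares problem.
    A = np.hstack([X_train[idx], np.ones((k, 1))])
    coef, *_ = np.linalg.lstsq(A, y_train[idx], rcond=None)
    return np.append(x_query, 1.0) @ coef

# Example: noisy samples of a smooth 1-D function.
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
print(local_linear_predict(X, y, np.array([1.0])))   # close to sin(1.0)
```

Setting k=1 recovers the single nearest-neighbour model also used in the comparison.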
A new strategy for incremental building of multilayer feedforward neural networks is proposed in the...
The understanding of generalization in machine learning is in a state of flux. This is partly due to...
Traditional neural networks like multi-layered perceptrons (MLP) use example patterns, i.e., pairs o...
Thesis (Ph. D.)--University of Hawaii at Manoa, 1992. Includes bibliographical references (leaves 144...
Approximation of high-dimensional functions is a challenge for neural networks due to the curse of d...
Noise disturbance in training data prevents a good approximation of a function by neural ...
This paper examines the function approximation properties of the "random neural network model&q...
The focus of this paper is on the neural network modelling approach that has gained increasing recog...
We present a hybrid radial basis function (RBF) sigmoid neural network with a three-step training al...
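The three-step training algorithm is not spelled out in the truncated abstract, but the architecture itself can be sketched: a hidden layer mixing radial basis and sigmoid units feeding a linear output. All names, shapes, and parameter values below are assumptions made for illustration.

```python
# Minimal sketch of a hidden layer combining RBF and sigmoid units,
# in the spirit of the hybrid model described above (not the authors' code).
import numpy as np

def hybrid_hidden(x, centers, widths, W_sig, b_sig):
    # Radial basis units respond locally around their centers...
    rbf = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2 * widths ** 2))
    # ...while sigmoid units respond globally along a projection of x.
    sig = 1.0 / (1.0 + np.exp(-(W_sig @ x + b_sig)))
    return np.concatenate([rbf, sig])           # fed to a linear output layer

x = np.array([0.5, -1.0])
centers = np.array([[0.0, 0.0], [1.0, -1.0]])   # 2 RBF units
widths = np.array([1.0, 0.5])
W_sig = np.array([[1.0, -1.0], [0.5, 2.0]])     # 2 sigmoid units
b_sig = np.zeros(2)
print(hybrid_hidden(x, centers, widths, W_sig, b_sig))
```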
This paper examines the function approximation properties of the random neural-network model or GN...
In this dissertation, we have investigated the representational power of multilayer feedforward neur...
Capabilities of linear and neural-network models are compared from the point of view of requirements...
Neural networks provide a more flexible approximation of functions than traditional linear regressio...
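The flexibility claim can be made concrete with a toy comparison: a straight-line fit versus a tiny one-hidden-layer network on a nonlinear target. The network size, learning rate, and iteration count are arbitrary choices for the example, not from the source.

```python
# Hedged illustration: linear regression vs. a small MLP on a nonlinear target.
import numpy as np

rng = np.random.default_rng(1)
X = np.linspace(-2, 2, 100).reshape(-1, 1)
y = np.tanh(3 * X[:, 0])                        # nonlinear target function

# Linear regression: the best it can do is a straight line.
A = np.hstack([X, np.ones_like(X)])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
lin_err = np.mean((A @ w - y) ** 2)

# One-hidden-layer tanh network trained by plain batch gradient descent.
W1, b1 = rng.standard_normal((8, 1)), np.zeros(8)
W2, b2 = rng.standard_normal(8) * 0.1, 0.0
for _ in range(2000):
    H = np.tanh(X @ W1.T + b1)                  # (100, 8) hidden activations
    pred = H @ W2 + b2
    g = 2 * (pred - y) / len(y)                 # dMSE/dpred
    W2 -= 0.1 * (H.T @ g); b2 -= 0.1 * g.sum()
    gH = np.outer(g, W2) * (1 - H ** 2)         # backprop through tanh
    W1 -= 0.1 * (gH.T @ X); b1 -= 0.1 * gH.sum(axis=0)
mlp_err = np.mean((np.tanh(X @ W1.T + b1) @ W2 + b2 - y) ** 2)
print(f"linear MSE={lin_err:.4f}  mlp MSE={mlp_err:.4f}")  # MLP error is far lower
```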