By ignoring unseen samples far away from the training samples, our study gives a new norm-based derivation of the localized generalization error bound. Motivated by the above research, this paper proposes a new method for constructing radial basis function neural networks that minimizes the sum of the training error and the stochastic sensitivity. Experimental results show that the new method leads to simpler network architectures with better performance.
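The construction described above can be read as a regularized least-squares problem: fit the RBF output weights so that the training error plus a penalty on the network's sensitivity to small input perturbations is minimized. The sketch below is a minimal Python illustration of that idea, assuming Gaussian hidden units with fixed centers and a Monte Carlo estimate of the stochastic sensitivity; the function names, noise scale, and closed-form solution are illustrative assumptions rather than the authors' actual algorithm.

import numpy as np

def rbf_design_matrix(X, centers, width):
    # Gaussian RBF activations: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2)).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf_with_sensitivity(X, y, centers, width, lam=0.1, noise_std=0.05, n_mc=20, seed=0):
    # Solve for output weights minimizing
    #   ||Phi w - y||^2 + lam * E[ ||(Phi(x + dx) - Phi(x)) w||^2 ],
    # with the expectation approximated by Monte Carlo input perturbations
    # (a stand-in for the stochastic sensitivity term in the abstract above).
    rng = np.random.default_rng(seed)
    Phi = rbf_design_matrix(X, centers, width)
    S = np.zeros((Phi.shape[1], Phi.shape[1]))
    for _ in range(n_mc):
        Xp = X + rng.normal(scale=noise_std, size=X.shape)
        D = rbf_design_matrix(Xp, centers, width) - Phi
        S += D.T @ D
    S /= n_mc
    # Closed-form regularized least squares.
    return np.linalg.solve(Phi.T @ Phi + lam * S, Phi.T @ y)

# Toy usage: approximate sin(x) with 10 hidden RBF units.
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = np.sin(X).ravel()
centers = X[np.linspace(0, len(X) - 1, 10, dtype=int)]  # evenly spaced centers
w = fit_rbf_with_sensitivity(X, y, centers, width=0.8)
pred = rbf_design_matrix(X, centers, 0.8) @ w
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))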
Abstract. This paper describes a method of supervised learning based on forward selection branching....
It has been shown that the selection of the most similar training patterns to generalize a new sampl...
Feedforward neural networks have demonstrated an ability to learn arbitrary nonlinear ma...
Proceedings of: International Conference on Artificial Neural Networks, ICANN 2001. Vienna, Austria, Au...
We analyze how radial basis functions are able to handle problems which are not linearly separable. ...
Conventionally, a radial basis function (RBF) network is constructed by obtaining cluster centers of...
For the learning problem of the Radial Basis Function Process Neural Network (RBF-PNN), an optimization trai...
Abstract — Function approximation arises in many applications. The radial basis function (RB...
Proceedings of: 16th International Conference on Artificial Neural Networks, ICANN 2006. Athens, Gree...
In intelligent control applications, neural models and controllers are usually designed by performin...
A radial basis function (RBF) neural network is constructed from a certain number of RBF neurons, and thes...
Radial basis function networks (RBFNs) have gained widespread appeal amongst researchers and have sh...
The problem of training a radial basis function (RBF) neural network for distinguishing two disjoint...
Learning from examples plays a central role in artificial neural networks. The success of many learn...
Artificial neural networks are powerful tools for analysing information expressed as data sets, which...