A new method for pruning hidden neurons in neural networks is presented in this paper. Hidden neurons are removed by analyzing the orthogonal projection correlations among the outputs of the other hidden neurons. The method guarantees the least loss of weight information in terms of orthogonal projection. The remaining weights and thresholds are updated based on weight crosswise propagation. A practical technique for penalizing superfluous hidden neurons is explored. Retraining is needed after pruning. Extensive experiments demonstrate that the method gives better initial points for retraining and that retraining requires fewer epochs.
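The abstract describes the approach only at a high level. The sketch below illustrates one plausible way orthogonal-projection-based pruning with a compensating weight update could be implemented in NumPy; the function name, the least-squares formulation, and the way the pruned neuron's outgoing weights are redistributed are illustrative assumptions, not the paper's exact "weight crosswise propagation" rule.

```python
# Hypothetical sketch (not the paper's exact algorithm): score each hidden
# neuron by how well its output over the training set is reproduced by the
# orthogonal projection onto the span of the other neurons' outputs, prune
# the most redundant one, and fold its outgoing weights into the survivors.
import numpy as np

def prune_one_hidden_neuron(H, W_out):
    """H: (n_samples, n_hidden) hidden outputs; W_out: (n_hidden, n_out) output weights.
    Returns (kept_indices, W_out_updated) after removing one redundant neuron."""
    n_hidden = H.shape[1]
    best_idx, best_residual, best_coef = None, np.inf, None
    for j in range(n_hidden):
        others = np.delete(H, j, axis=1)        # outputs of the remaining neurons
        target = H[:, j]
        # Least-squares coefficients of the orthogonal projection of neuron j's
        # output onto the span of the other neurons' outputs.
        coef, *_ = np.linalg.lstsq(others, target, rcond=None)
        residual = np.linalg.norm(others @ coef - target)
        if residual < best_residual:
            best_idx, best_residual, best_coef = j, residual, coef
    kept = [i for i in range(n_hidden) if i != best_idx]
    # Compensating update: distribute the pruned neuron's output weights to the
    # survivors according to the projection coefficients, so the network output
    # changes as little as possible before retraining.
    W_new = W_out[kept, :] + np.outer(best_coef, W_out[best_idx, :])
    return kept, W_new
```

After each pruning step, the reduced network would typically be retrained for a few epochs; the claim in the abstract is that starting from such a projection-preserving update requires fewer retraining epochs than starting from scratch.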
The fault tolerance of the multi-layer perceptron is strongly related to its redundant h...
Multilayer neural networks trained with backpropagation are in general not robust against t...
Several neural network architectures have been developed over the past several years. One of the mos...
It is a common practice to adjust the number of hidden neurons in training, and the removal of neuro...
In this paper, the techniques of removing hidden neurons in cascade-correlation neural networks are ...
In the design of neural networks, how to choose the proper size of a network for a given task is an ...
The performance of an Artificial Neural Network (ANN) strongly depends on its hidden layer architect...
The architecture of an artificial neural network has a great impact on the generalization power. M...
This paper presents two compensation methods for multilayer perceptrons (MLPs) which are very diffic...
Choosing the training algorithm and determining the architecture of artificial neural networks are v...
A fast parsimonious linear-programming-based algorithm for training neural networks is proposed that...
The number of hidden layers is crucial in multilayer artificial neural networks. In general, general...
There are several papers on pruning methods in the artificial neural networks area. However, with ra...
Under certain conditions, a neural network may be trained to perform a speci...
Using the backpropagation algorithm (BP) to train neural networks is a widely adopted practice in both th...