Adjusting the number of hidden neurons during training is common practice, and the removal of neurons plays an indispensable role in this architecture manipulation. In this paper, a succinct and unified mathematical form for removing neurons, based on orthogonal projection and crosswise propagation in a feedforward layer, is generalized and further developed for several neural networks with different architectures. For a trained neural network, the method proceeds in three stages. In the first stage, the output vectors of the feedforward observation layer are classified into clusters. In the second stage, orthogonal projection is performed to locate a ...
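As an illustration of the projection-based removal step described above, the following is a minimal sketch (not the paper's exact algorithm; the clustering stage is omitted and the names `prune_hidden_neuron`, `H`, and `W_out` are assumptions): each candidate neuron's output vector over the training data is projected onto the span of the remaining neurons' output vectors, the neuron with the smallest residual is removed, and its outgoing weights are folded into the remaining neurons via the projection coefficients, which is one way to realize the crosswise compensation.

```python
import numpy as np

def prune_hidden_neuron(H, W_out):
    """Hedged sketch of orthogonal-projection-based neuron removal.

    H     : (n_samples, n_hidden) hidden-layer outputs on the training data
    W_out : (n_hidden, n_outputs) outgoing weight matrix of the hidden layer
    """
    n_hidden = H.shape[1]
    best_j, best_res, best_coef = None, np.inf, None
    for j in range(n_hidden):
        rest = np.delete(H, j, axis=1)  # output vectors of the other neurons
        # Least-squares projection of neuron j's output onto the others' span
        coef, _, _, _ = np.linalg.lstsq(rest, H[:, j], rcond=None)
        residual = np.linalg.norm(rest @ coef - H[:, j])
        if residual < best_res:
            best_j, best_res, best_coef = j, residual, coef

    # Crosswise compensation: redistribute the removed neuron's outgoing
    # weights to the remaining neurons, scaled by the projection coefficients.
    W_new = np.delete(W_out, best_j, axis=0) + np.outer(best_coef, W_out[best_j])
    return best_j, W_new
```

In this sketch the network's outputs change only by the projection residual, so neurons whose output vectors are nearly linear combinations of their peers can be removed with little loss.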
In this study mathematical model order reduction is applied to a nonlinear model of a network of bio...
A fast parsimonious linear-programming-based algorithm for training neural networks is proposed that...
The traditional multilayer perceptron (MLP) using a McCulloch-Pitts neuron model is inherently limit...
A new method of pruning away hidden neurons in neural networks is presented in this paper. The hidde...
In this paper, the techniques of removing hidden neurons in cascade-correlation neural networks are ...
The performance of an Artificial Neural Network (ANN) strongly depends on its hidden layer architect...
In the design of neural networks, how to choose the proper size of a network for a given task is an ...
A critical question in the neural network research today concerns how many hidden neurons to use. Th...
Choosing the training algorithm and determining the architecture of artificial neural networks are v...
Using the backpropagation algorithm (BP) to train neural networks is a widely adopted practice in both th...
The problem of determining the proper size of an artificial neural network is recognized to be cruci...
This paper presents a new constructive method and pruning approaches to control the design...
The architecture of an artificial neural network has a great impact on the generalization power. M...
In this paper a new non-conventional growing neural network is proposed. It coincides with the Cascade-...
Recurrent neural networks are attracting considerable interest within the neural network domain espe...