Abstract — A method to improve the generalization ability of a multilayered perceptron (MLP) network is proposed here. The method expands a given training set and trains the MLP with a newly generated data set in each epoch. The data-generation scheme maintains the spatial density of the original training sample. Experiments show that the method can yield excellent generalization.
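The abstract only sketches the approach, so the Python fragment below is a minimal illustration under stated assumptions rather than the paper's actual procedure: synthetic points are drawn around each training sample with a Gaussian spread tied to the distance to that sample's nearest neighbour (one plausible way to keep the spatial density of the original set), and a fresh perturbed set is generated for every training epoch. The names expand_training_set and train_with_expansion, the noise_scale parameter, and the use of scikit-learn's MLPClassifier are illustrative choices, not taken from the paper.

import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.neural_network import MLPClassifier

def expand_training_set(X, y, noise_scale=0.5, seed=None):
    """Return a perturbed copy of (X, y).

    Each synthetic point is drawn from a Gaussian centred on an original
    sample, with a spread proportional to the distance to that sample's
    nearest neighbour, so dense regions receive tighter perturbations and
    the spatial density of the original set is roughly preserved
    (an assumption; the paper's generation rule may differ).
    """
    rng = np.random.default_rng(seed)
    # Column 0 of the distances is the point itself; column 1 is the nearest other sample.
    dists, _ = NearestNeighbors(n_neighbors=2).fit(X).kneighbors(X)
    sigma = noise_scale * dists[:, 1:2]            # per-sample spread, shape (n, 1)
    X_new = X + rng.standard_normal(X.shape) * sigma
    return X_new, y                                # labels are inherited unchanged

def train_with_expansion(X, y, n_epochs=100, hidden=(20,), seed=0):
    """Train an MLP, regenerating the expanded data set at every epoch."""
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=1, warm_start=True)
    for epoch in range(n_epochs):
        X_ep, y_ep = expand_training_set(X, y, seed=seed + epoch)
        clf.fit(X_ep, y_ep)                        # one pass over this epoch's data
    return clf

Because warm_start=True is set, each call to fit continues from the previous weights, so the network sees a different synthetic data set in every epoch while training a single model.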