This paper proves that the task of computing near-optimal weights for sigmoidal nodes under the L1 regression norm is NP-Hard. For the special case where the sigmoid is piecewise-linear we prove a slightly stronger result, namely that computing the optimal weights is NP-Hard. These results parallel those for the one-node pattern recognition problem, namely that determining the optimal weights for a threshold logic node is also intractable. Our results have important consequences for constructive algorithms that build a regression model one node at a time. They suggest that although such methods are (in principle) capable of producing efficient-size representations (e.g. see Barron (1993) and Jones (1992)), finding such representations may be comput...
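For concreteness, the one-node L1 regression problem referred to above can be written as the following optimization; the notation (training pairs $(x_i, y_i)$, weight vector $w$, bias $b$, sigmoid $\sigma$, sample size $m$) is our own shorthand rather than the paper's:

$$\min_{w \in \mathbb{R}^n,\; b \in \mathbb{R}} \;\; \sum_{i=1}^{m} \bigl| y_i - \sigma(w \cdot x_i + b) \bigr|.$$

The hardness claims concern this objective: computing weights whose L1 error is near-optimal is NP-Hard for sigmoidal $\sigma$, and computing exactly optimal weights is NP-Hard when $\sigma$ is piecewise-linear.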
Learning real weights for a McCulloch-Pitts neuron is equivalent to linear programming and c...
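To illustrate the equivalence with linear programming asserted here, finding real weights that fit a labelled sample $(x_i, y_i)$ with $y_i \in \{-1, +1\}$ can be posed as a linear feasibility problem; the unit margin below is a conventional normalization of our own, not part of the quoted abstract:

$$\text{find } w \in \mathbb{R}^n,\ \theta \in \mathbb{R} \quad \text{such that} \quad y_i \,(w \cdot x_i - \theta) \ge 1 \ \text{ for all } i.$$

Any feasible pair $(w, \theta)$ classifies every training example correctly, so a polynomial-time linear-programming solver learns the neuron whenever a consistent threshold function exists.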
In this paper we consider the problem of learning a linear threshold function (a halfspace in n dime...
Given any linear threshold function f on n Boolean vari-ables, we construct a linear threshold funct...
Abstract: We consider a 2-layer, 3-node, n-input neural network whose nodes compute linear threshold...
We consider a 2-layer, 3-node, n-input neural network whose nodes compute linear threshold functions...
Sample complexity results from computational learning theory, when applied to neural network learnin...
When training perceptrons (linear classifiers) not only the performance on the training set is impor...
The back-propagation learning algorithm for multi-layered neural networks, which is often successful...
Feedforward nets with sigmoidal activation functions are often designed by minimizing a cost criteri...
Hammer B. Training a sigmoidal network is difficult. In: Verleysen M, ed. European Symposium on Arti...
It is well known that (McCulloch-Pitts) neurons are efficiently trainable to learn an unknow...
We demonstrate that the problem of training neural networks with small (average) squared error is co...
This paper shows that if a large neural network is used for a pattern classification problem, and th...
Linear threshold elements are the basic building blocks of artificial neural networks. A linear thre...
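For reference, a linear threshold element with weight vector $w \in \mathbb{R}^n$ and threshold $\theta$ computes (a standard formulation, supplied here because the abstract is truncated):

$$f(x) = \mathbf{1}[\, w \cdot x \ge \theta \,] = \begin{cases} 1 & \text{if } w \cdot x \ge \theta,\\ 0 & \text{otherwise.} \end{cases}$$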
We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully ...