We consider the computational complexity of learning by neural nets. We are interested in how hard it is to design appropriate neural net architectures and to train neural nets for general and specialized learning tasks. Our main result shows that the training problem for 2-cascade neural nets (which have only two non-input nodes, one of which is hidden) is NP-complete, which implies that finding an optimal net (in terms of the number of non-input units) that is consistent with a set of examples is also NP-complete. This result also demonstrates a surprising gap between the computational complexities of one-node (perceptron) and two-node neural net training problems, since the perceptron training problem can be solved in polynomial time ...
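To make the gap described above concrete, here is a minimal sketch of the classical perceptron learning rule for a single linear threshold unit, the one-node case that the abstract says is trainable in polynomial time. The function and variable names (train_perceptron, X, y) and the toy data are illustrative assumptions, not taken from the cited work.

```python
# Minimal sketch of the classical perceptron learning rule for a single
# linear threshold unit -- the one-node case contrasted with the NP-complete
# two-node training problem above. Names and data are illustrative only.
import numpy as np

def train_perceptron(X, y, max_epochs=1000):
    """X: (m, n) array of examples; y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # Linear threshold: predict +1 if w.x + b >= 0, else -1
            pred = 1 if np.dot(w, xi) + b >= 0 else -1
            if pred != yi:
                # Standard perceptron update on a misclassified example
                w += yi * xi
                b += yi
                errors += 1
        if errors == 0:  # consistent with all training examples
            break
    return w, b

# Toy usage: a linearly separable set (AND of two inputs, labels in {-1, +1})
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
```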
Neural Networks (NNs) struggle to efficiently learn certain problems, such as parity problems, even ...
Abstract: Various theoretical results show that learning in conventional feedforward neural networks...
This paper discusses within the framework of computational learning theory the current state of know...
We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully ...
Given a neural network, training data, and a threshold, it was known that it is NP-hard to find weig...
We consider a 2-layer, 3-node, n-input neural network whose nodes compute linear threshold functions...
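The architecture described in this abstract can be sketched as two hidden linear threshold units over the n inputs feeding one output unit. The forward pass below is only an illustration under that reading; the identifiers and the hand-picked output-node weights are assumptions made for the sketch, not the paper's construction.

```python
# Sketch of the forward pass of a 2-layer, 3-node network of linear threshold
# units: two hidden threshold units over the n inputs feed one output threshold
# unit. All names and weight values are illustrative assumptions.
import numpy as np

def threshold(z):
    return 1 if z >= 0 else 0

def forward(x, W_hidden, b_hidden, w_out, b_out):
    """x: length-n input; W_hidden: (2, n); b_hidden: (2,); w_out: (2,)."""
    h = np.array([threshold(np.dot(W_hidden[i], x) + b_hidden[i])
                  for i in range(2)])
    return threshold(np.dot(w_out, h) + b_out)

# Toy usage: with these hand-picked weights the output is the AND of two
# half-space tests on a 3-dimensional input.
W_hidden = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
b_hidden = np.array([-1.5, -1.5])
w_out = np.array([1.0, 1.0])
b_out = -1.5
print(forward(np.array([1.0, 1.0, 1.0]), W_hidden, b_hidden, w_out, b_out))  # 1
```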
We survey some relationships between computational complexity and neural network theory. Here, only ...
We consider the problem of efficiently learning in two-layer neural networks. We investigate...
Abstract. It is shown that high-order feedforward neural nets of constant depth with piecewise-polyn...
The computational power of neural networks depends on properties of the real numbers used as weights...
It is well-known that neural networks are computationally hard to train. On the other hand, in pract...
We survey some of the central results in the complexity theory of discrete neural networks, with ...