Abstract: We formalize a notion of loading information into connectionist networks that characterizes the training of feed-forward neural networks. This problem is NP-complete, so we look for tractable subcases of the problem by placing constraints on the network architecture. The focus of these constraints is on various families of “shallow” architectures, which are defined to have bounded depth and unbounded width. We introduce a perspective on shallow networks, called the Support Cone Interaction (SCI) graph, which is helpful in distinguishing tractable from intractable subcases: when the SCI graph is a tree or is of limited bandwidth, loading can be accomplished in polynomial time; when its bandwidth is not limited we find the problem NP-...
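The loading problem above can be read as a decision question: given a fixed architecture and a finite set of input/output examples, does some weight assignment make the network reproduce every example? The Python sketch below only illustrates that question on a toy threshold network whose weights are restricted to a small grid so brute force terminates; the architecture, grid, and function names are hypothetical and unrelated to the SCI-graph construction.

from itertools import product

# Toy illustration of "loading": search for weights of a fixed threshold
# architecture that reproduce every training example. Hypothetical example,
# not the paper's construction; the weight grid is tiny so the search ends.

def threshold(z):
    return 1 if z >= 0 else 0

def net(w, x):
    # Fixed architecture: 2 inputs -> 2 hidden threshold units -> 1 output unit.
    h1 = threshold(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = threshold(w[3] * x[0] + w[4] * x[1] + w[5])
    return threshold(w[6] * h1 + w[7] * h2 + w[8])

def loadable(examples, grid=(-1, 0, 1)):
    # Return a consistent weight vector if one exists on the grid, else None.
    for w in product(grid, repeat=9):
        if all(net(w, x) == y for x, y in examples):
            return w
    return None

xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(loadable(xor))   # XOR is loadable into this small architecture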
DasGupta B, Hammer B. Hardness of approximation of the loading problem for multi-layered feedforward...
We consider a 2-layer, 3-node, n-input neural network whose nodes compute linear threshold functions...
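For concreteness, the 2-layer, 3-node, n-input network of linear threshold units described in the entry above amounts to two halfspace indicators feeding a third threshold unit. A minimal sketch follows; the weights, the value of n, and the function name are arbitrary placeholders, not the paper's construction.

import numpy as np

# Hypothetical instance of the 3-node architecture: n inputs feed two linear
# threshold units; their binary outputs feed a single output threshold unit.
def three_node_net(W, b, v, c, x):
    h = (W @ x + b >= 0).astype(float)      # two hidden halfspace indicators
    return int(v @ h + c >= 0)              # output threshold unit

rng = np.random.default_rng(0)
n = 5                                       # arbitrary input dimension
W, b = rng.normal(size=(2, n)), rng.normal(size=2)
v, c = np.array([1.0, 1.0]), -1.5           # output unit acts as an AND gate
print(three_node_net(W, b, v, c, rng.normal(size=n)))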
A hallmark of graph neural networks is their ability to distinguish the isomorphism class of their i...
The present work deals with one of the major and not yet completely understood topics of supervised ...
We deal with computational issues of loading a fixed-architecture neural network with a set of posit...
A long-standing open problem in the theory of neural networks is the development of quantitative met...
Recently, researchers in the artificial neural network field have focused their attention on connect...
We develop a fast end-to-end method for training lightweight neural networks using multiple classifie...
The authors address the problem of choosing synaptic weights in a recursive (Hopfield) neural networ...
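The entry above is truncated, so the paper's own weight-selection procedure is not visible here; as a point of reference, the classical way of choosing Hopfield weights to store a set of ±1 patterns is the Hebbian outer-product rule, sketched below. The patterns and the synchronous update schedule are illustrative only.

import numpy as np

# Classical Hebbian (outer-product) rule for a Hopfield network; a standard
# baseline, not necessarily the method the cited paper proposes.
def hebbian_weights(patterns):
    P = np.asarray(patterns, dtype=float)   # one +/-1 pattern per row
    W = P.T @ P / P.shape[1]                # symmetric weight matrix
    np.fill_diagonal(W, 0.0)                # no self-connections
    return W

def recall(W, state, steps=10):
    s = np.asarray(state, dtype=float)
    for _ in range(steps):                  # simplified synchronous updates
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return s

patterns = [[1, -1, 1, -1, 1, -1], [1, 1, 1, -1, -1, -1]]
W = hebbian_weights(patterns)
noisy = [1, -1, 1, -1, 1, 1]                # first pattern with one bit flipped
print(recall(W, noisy))                     # converges back to the stored pattern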
This paper deals with learnability of concept classes defined by neural networks, showing the hardne...
Recently, deep networks were proved to be more effective than shallow architectures at facing complex ...
This paper studies the expressive power of graph neural networks falling within the message-passing ...
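A standard reference point for the expressive power of message-passing graph neural networks is the 1-dimensional Weisfeiler-Leman (colour refinement) test, which upper-bounds their ability to distinguish non-isomorphic graphs. A minimal sketch of the refinement step follows; the graphs and function names are illustrative.

# Minimal 1-WL colour refinement on adjacency-list graphs; two graphs whose
# final colour multisets differ are guaranteed non-isomorphic (the converse fails).
def wl_colors(adj, rounds=3):
    colors = {v: 0 for v in adj}                        # uniform initial colouring
    for _ in range(rounds):
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}     # relabel with fresh colours
    return sorted(colors.values())

triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_colors(triangle), wl_colors(path))             # different colour multisets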
We investigate the complexity of the reachability problem for (deep) neural networks: does it compute...
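The entry above is cut off, but the reachability question is usually posed as: given the network, a set of admissible inputs, and a set of target outputs, is some admissible input mapped into the target set? Deciding this exactly is the hard part; the sketch below only searches for a positive witness by random sampling on a toy ReLU network, with every weight, box, and interval chosen arbitrarily.

import numpy as np

# Toy witness search for a reachability query: sampling can confirm that the
# target output interval is reached (by exhibiting an input), never refute it.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)     # placeholder weights
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def f(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2        # 3-4-1 ReLU network

target = (0.5, 2.0)                                      # arbitrary output interval
witness = None
for _ in range(10_000):
    x = rng.uniform(-1.0, 1.0, size=3)                   # inputs from the box [-1, 1]^3
    if target[0] <= f(x)[0] <= target[1]:
        witness = x
        break
print(witness)                                           # an input witness, or None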