We show that neural networks with three-times continuously differentiable activation functions are capable of computing a certain family of n-bit Boolean functions with two gates, whereas networks composed of binary threshold functions require at least \Omega(\log n) gates. Thus, for a large class of activation functions, analog neural networks can be more powerful than discrete neural networks, even when computing Boolean functions.

1 Introduction. Artificial neural networks have become a popular model for machine learning, and many results have been obtained regarding their application to practical problems. Typically, the network is trained to encode complex associations between inputs and outputs during supervised training cycles, wh...
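Returning to the abstract's separation claim, one way to read it formally is sketched below; the gate-count notation \mathrm{size}_\sigma and the exact non-degeneracy condition on \sigma are our own shorthand, since the excerpt does not fix them, and the witnessing family f_n is left unspecified here:

\[
\exists\,\{f_n\}_{n\ge 1},\quad f_n\colon\{0,1\}^n\to\{0,1\},\quad\text{such that}
\]
\[
\mathrm{size}_{\sigma}(f_n)\;\le\;2\ \ \text{for every suitably non-degenerate }\sigma\in C^3,
\qquad\text{while}\qquad
\mathrm{size}_{\mathrm{threshold}}(f_n)\;=\;\Omega(\log n).
\]

In words: a fixed two-gate analog network over any such smooth activation computes f_n exactly, whereas any circuit built from binary threshold gates needs at least logarithmically many gates for the same task.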
We consider learning on multilayer neural nets with piecewise-polynomial activation functions and a...
This paper aims to place neural networks in the context of boolean circuit complexity. We define app...
The finite discrete-time recurrent neural networks are also exploited for potentially infinite compu...
The paper will show that in order to obtain minimum size neural networks (i.e., size-optimal) for im...
It is shown that high-order feedforward neural nets of constant depth with piecewise-polyn...
This paper starts by overviewing results dealing with the approximation capabilities of neural netwo...
The paper overviews results dealing with the approximation capabilities of neural networks, and boun...
We pursue a particular approach to analog computation, based on dynamical systems of the type used i...
This report surveys some connections between Boolean functions and artificial neural networks. The f...
Experimental evidence has shown analog neural networks to be extremely fault-tolerant; in particular....
The paper overviews results dealing with the approximation capabilities of neural networks, as well ...
Wolfgang Maass, Institute for Theoretical Computer Science, Technische Universitaet Graz, Klosterwie...
This paper reviews some of the recent results in applying the theory of Probably Approximately Corre...