Abstract. We investigate the computational power of recurrent neural networks that apply the sigmoid activation function σ(x) = 2/(1 + e^(-x)) - 1. These networks are extensively used in automatic learning of non-linear dynamical behavior. We show that in the noiseless model, there exists a universal architecture that can be used to compute any recursive (Turing) function. This is the first result of its kind for the sigmoid activation function; previous techniques applied only to linearized and truncated versions of this function. The significance of our result, besides the proof technique itself, lies in the popularity of the sigmoidal function both in engineering applications of artificial neural networks and in biological modelling. Our techniqu...
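As a point of reference, here is a minimal numerical sketch of this activation and of one synchronous update step of a discrete-time sigmoidal recurrent network. It is illustrative only, not code from the paper; the network size, weights, and input sequence below are arbitrary choices for the example.

```python
import numpy as np

def sigma(x):
    # Sigmoid from the abstract: sigma(x) = 2/(1 + e^(-x)) - 1, with range (-1, 1).
    return 2.0 / (1.0 + np.exp(-x)) - 1.0

def step(x, u, W, U, b):
    # One synchronous update of a discrete-time recurrent network:
    # x_{t+1} = sigma(W x_t + U u_t + b), with state x and external input u.
    return sigma(W @ x + U @ u + b)

# Tiny example: a 3-neuron network driven by a single binary input line.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3))   # recurrent weights (arbitrary)
U = rng.normal(size=(3, 1))   # input weights (arbitrary)
b = np.zeros(3)               # biases
x = np.zeros(3)               # initial state
for u_t in (1.0, 0.0, 1.0):
    x = step(x, np.array([u_t]), W, U, b)
print(x)  # state after reading the input sequence 1, 0, 1
```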
Abstract. This paper shows the existence of a finite neural network, made up of sigmoidal neurons, ...
In this paper, a new neural network architecture, based on an adaptive activation function, called ge...
There has been a lot of interest in the use of discrete-time recurrent neural nets (DTRNN) to learn ...
Abstract. Hava Siegelmann and Eduardo Sontag have shown that recurrent neural networks using the linea...
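This abstract is cut off here; assuming the truncated phrase refers to the saturated-linear ("linear sigmoid") activation used in the Siegelmann-Sontag construction, its contrast with the smooth sigmoid of the first abstract can be sketched as follows (illustrative only):

```python
import numpy as np

def saturated_linear(x):
    # Piecewise-linear "linear sigmoid": 0 for x < 0, x on [0, 1], 1 for x > 1.
    return np.clip(x, 0.0, 1.0)

def smooth_sigmoid(x):
    # Smooth sigmoid from the first abstract, with range (-1, 1).
    return 2.0 / (1.0 + np.exp(-x)) - 1.0

xs = np.linspace(-2.0, 2.0, 5)   # sample points -2, -1, 0, 1, 2
print(saturated_linear(xs))      # [0. 0. 0. 1. 1.]
print(smooth_sigmoid(xs))        # smooth values strictly inside (-1, 1)
```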
We show how to use recursive function theory to prove Turing universality of finite analog recurrent...
We consider a model of so-called hybrid recurrent neural networks composed of Boolean input and ou...
In this paper, we provide a historical survey of the most significant results concerning the computa...
Understanding the dynamical and computational capabilities of neural models represents an issue of c...
We present a complete overview of the computational power of recurrent neural networks involved in a...
This paper shows the existence of a finite neural network, made up of sigmoidal neurons, which sim...
This article studies the computational power of various discontinuous real computational models that...
This paper studies the computational power of various discontinuous real computational models that ...
Abstract. We present a general analysis of highly connected recurrent neural networks which are abl...
This paper reviews some of the recent results in applying the theory of Probably Approximately Corre...
We provide a characterization of the expressive powers of several models of nondeterministic recurre...