Abstract: We consider analog recurrent neural networks working on infinite input streams, provide a complete topological characterization of their expressive power, and compare it to the expressive power of classical abstract machines that read infinite words. More precisely, we consider analog recurrent neural networks as language recognizers over the Cantor space, and prove that the classes of ω-languages recognized by deterministic and non-deterministic analog networks correspond precisely to the classes of Π⁰₂-sets and Σ¹₁-sets of the Cantor space, respectively. Furthermore, we show that this result generalizes to more expressive analog networks equipped with any kind of Borel accepting condition. Therefore, in the deterministic case, the...
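For reference, the characterization stated in the abstract can be written out as follows; the shorthand L(N) for the ω-language recognized by a network N, and the identification of the Cantor space with {0,1}^ω, are our notational conventions rather than quotations from the paper:

\[
\{\, L(\mathcal{N}) : \mathcal{N} \text{ a deterministic ARNN} \,\} \;=\; \Pi^0_2\big(\{0,1\}^\omega\big),
\qquad
\{\, L(\mathcal{N}) : \mathcal{N} \text{ a nondeterministic ARNN} \,\} \;=\; \Sigma^1_1\big(\{0,1\}^\omega\big).
\]

Here Π⁰₂ is the second multiplicative level of the Borel hierarchy (countable intersections of open sets, i.e. the G_δ sets), and Σ¹₁ is the class of analytic sets (projections of Borel subsets of the product space). In particular, since Π⁰₂ is a proper subclass of Σ¹₁, nondeterminism lifts these networks from a low Borel class all the way to the analytic class.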
The authors present a general framework within which the computability of solutions to problems by v...
This paper studies the computational power of various discontinuous real computational models that ...
We show that neural networks with three-times continuously differentiable activation functions are c...
Analog and evolving recurrent neural networks are super-Turing powerful. Here,...
"Artificial neural networks" provide an appealing model of computation. Such networks consist of an ...
We consider a model of so-called hybrid recurrent neural networks composed of Boolean input and ou...
In this paper, we provide a historical survey of the most significant results concerning the computa...
We pursue a particular approach to analog computation, based on dynamical systems of the type used i...
The model of analog recurrent neural networks (ARNN) is typically perceived as based on eith...
We introduce a model of nondeterministic hybrid recurrent neural networks – made up of Boolean input...
Hava Siegelmann and Eduardo Sontag have shown that recurrent neural networks using the linea...
We provide a characterization of the expressive powers of several models of deterministic and nondet...
In classical computation, rational- and real-weighted recurrent neural networks were shown to be res...
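Several of the abstracts above concern recurrent networks with the linear-saturated activation in the Siegelmann–Sontag style. As a minimal, purely illustrative sketch (assuming the standard synchronous update x(t+1) = σ(A·x(t) + B·u(t) + c); the names A, B, c, step and the toy dimensions are our own, not taken from any cited paper), one such update step can be written as:

import numpy as np

def sigma(z):
    """Linear-saturated activation: 0 below 0, identity on [0, 1], 1 above 1."""
    return np.clip(z, 0.0, 1.0)

def step(x, u, A, B, c):
    """One synchronous update x(t+1) = sigma(A x(t) + B u(t) + c).

    x : state vector of the N analog neurons (reals in [0, 1])
    u : current input letter(s), e.g. one bit of the infinite stream
    A : N x N recurrent weight matrix
    B : N x M input weight matrix
    c : bias vector
    """
    return sigma(A @ x + B @ u + c)

# Tiny usage example: a 3-neuron network consuming one input bit per step.
rng = np.random.default_rng(0)
A = rng.uniform(-1, 1, size=(3, 3))
B = rng.uniform(-1, 1, size=(3, 1))
c = rng.uniform(-1, 1, size=3)

x = np.zeros(3)
for bit in [1, 0, 1, 1]:  # a finite prefix of an infinite input stream
    x = step(x, np.array([bit]), A, B, c)
print(x)

As the last abstract above notes, the weights are what set the power of the model: with rational entries in A, B, c such networks are Turing-equivalent, while arbitrary real weights yield the super-Turing ("analog") power these papers study.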