Understanding the dynamical and computational capabilities of neural models is an issue of central importance. Here, we consider a model of first-order recurrent neural networks whose synaptic weights are allowed to evolve over time, involved in a basic interactive and memory-active computational paradigm. In this context, we prove that the so-called interactive evolving recurrent neural networks are computationally equivalent to interactive Turing machines with advice, and hence capable of super-Turing computational power. We further provide a precise characterisation of the ω-translations realised by these networks. Therefore, the consideration of evolving capabilities in a first-order neural model provides the possibility of breaking the Turing barrier.
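As a rough illustration of the model described in this abstract, the sketch below gives one possible reading of an evolving first-order recurrent network driven by an input stream: a saturated-linear activation, synaptic weights that may change at every time step, and a single Boolean output cell. This is a minimal, assumption-laden sketch rather than the construction from the paper; the names `evolving_rnn_step`, `run_interactive`, and `weight_schedule` are illustrative only.

```python
import numpy as np

def sigma(x):
    # Saturated-linear activation: identity on [0, 1], clipped outside.
    return np.clip(x, 0.0, 1.0)

def evolving_rnn_step(x, u, W, W_in):
    # One synchronous first-order update: the next state is a weighted
    # sum of the current state and the current input, passed through sigma.
    return sigma(W @ x + W_in @ u)

def run_interactive(weight_schedule, input_stream, n_state):
    # Interactive paradigm: the network consumes the input stream one
    # symbol per step; "evolving" means the weight matrices returned by
    # weight_schedule may differ from one step to the next.
    x = np.zeros(n_state)
    outputs = []
    for t, u in enumerate(input_stream):
        W, W_in = weight_schedule(t)                # time-dependent weights
        x = evolving_rnn_step(x, np.asarray(u, dtype=float), W, W_in)
        outputs.append(1 if x[0] > 0.5 else 0)      # read one Boolean output cell
    return outputs

# Toy usage: two state cells, one input cell, weights held constant here.
schedule = lambda t: (np.array([[0.5, 0.2], [0.0, 0.3]]),
                      np.array([[1.0], [0.5]]))
print(run_interactive(schedule, [[1], [0], [1]], n_state=2))
```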
Deterministic behavior can be modeled conveniently in the framework of finite automata. We ...
In neural computation, the essential information is generally encoded into the neurons via their spi...
Humans are able to form internal representations of the information they process – a capability wh...
Understanding the dynamical and computational capabilities of neural models represents an issue of c...
We present a complete overview of the computational power of recurrent neural networks involved in a...
In classical computation, rational- and real-weighted recurrent neural networks were shown to be res...
In this paper, we provide a historical survey of the most significant results concerning the computa...
We consider a model of so-called hybrid recurrent neural networks composed of Boolean input and ou...
We provide a characterization of the expressive powers of several models of deterministic and nondet...
We provide a characterization of the expressive powers of several models of nondeterministic recurre...
Analog and evolving recurrent neural networks are super-Turing powerful. Here,...
Recent work by Siegelmann has shown that the computational power of recurrent neural networks matche...
Computation is classically studied in terms of automata, formal languages and ...
A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a...