In this paper we show that programming languages can be translated into recurrent (analog, rational-weighted) neural nets. Implementing programming languages in neural nets turns out to be not only theoretically exciting, but also to have practical implications for recent efforts to merge symbolic and subsymbolic computation. To be of practical use, it should be carried out in a context of bounded resources. Herein, we show how to use resource bounds to speed up computations over neural nets, through suitable data type coding as in the usual programming languages. We introduce data types and show how to code and keep them inside the information flow of neural nets. Data types and control structures are part of a suitable programming language c...
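As a minimal sketch of what "coding a data type inside the information flow" of a rational-weighted net can look like, the classic Siegelmann–Sontag-style construction stores a stack of bits as a single rational activation in [0, 1] using base-4 digits, with push, top, and pop realized by affine maps composed with a saturated-linear activation. The function names and the use of Python's `Fraction` here are illustrative assumptions, not the paper's own construction:

```python
from fractions import Fraction

def sigma(x):
    """Saturated-linear activation: identity on [0, 1], clamped outside."""
    return min(Fraction(1), max(Fraction(0), x))

def push(q, bit):
    # Prepend a base-4 digit: bit 0 -> digit 1, bit 1 -> digit 3.
    return q / 4 + Fraction(2 * bit + 1, 4)

def top(q):
    # 4q - 2 is >= 1 when the leading digit is 3 (bit 1),
    # and <= 0 when it is 1 (bit 0), so sigma reads off the top bit.
    return sigma(4 * q - 2)

def pop(q):
    # Strip the leading base-4 digit.
    return sigma(4 * q - (2 * top(q) + 1))

q = Fraction(0)        # empty stack encoded as 0
q = push(q, 1)         # stack: [1], q = 3/4
q = push(q, 0)         # stack: [0, 1], q = 7/16
assert top(q) == 0
q = pop(q)             # stack: [1]
assert top(q) == 1
```

Because every operation is an affine map followed by a saturation, each can be implemented by a single analog neuron with rational weights, which is what makes the bounded-resource coding of data types inside the net possible.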
There are families of neural networks that can learn to compute any function, provided sufficient tr...
Recently researchers have derived formal complexity analysis of analog computation in the setting of...
Recent work by Siegelmann has shown that the computational power of recurrent neural networks matche...
In a recent paper [Neto et al. 97] we showed that programming languages can be translated into recurre...
"Artificial neural networks" provide an appealing model of computation. Such networks consist of an ...
Computation is classically studied in terms of automata, formal languages and ...
Is thought computation over ideas? Turing, and many cognitive scientists since, have assumed so, and...
It has been one of the great challenges of neuro-symbolic integration to represent recursive logic p...
The compilation of high-level programming languages for parallel machines faces two challenges: maxi...
Studying symbolic computation in deep neural networks (DNNs) is essential for improving their explai...
This work investigates if the current neural architectures are adequate for learning symbolic rewrit...
The goal of neural-symbolic computation is to integrate robust connectionist learning and sound sym...
Rewriting systems are used in various areas of computer science, and especially in lambda-...
This paper deals with the simulation of Turing machines by neural networks. Such networks are made u...
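The simulation results above rest on networks of saturated-linear units. As an illustration of the basic mechanism (my own sketch, not the cited paper's construction), one step of a finite automaton can be computed by two layers of such units: conjunction neurons detect active (state, symbol) pairs, and next-state neurons sum the conjunctions routed to them. The parity automaton below is an assumed example:

```python
def sigma(x):
    """Saturated-linear activation: identity on [0, 1], clamped outside."""
    return min(1.0, max(0.0, x))

# Parity automaton: states (even, odd) one-hot; input bits (0, 1) one-hot.
# delta(even, 1) = odd, delta(odd, 1) = even, delta(s, 0) = s.
TRANSITIONS = {  # (state index, symbol index) -> next state index
    (0, 0): 0, (0, 1): 1,
    (1, 0): 1, (1, 1): 0,
}

def step(state, symbol):
    """One synchronous update of the two-layer saturated-linear net."""
    x = [0.0, 0.0]
    x[symbol] = 1.0
    # Layer 1: one conjunction neuron per transition; it saturates to 1
    # exactly when both its state line and its symbol line are active.
    c = {k: sigma(state[k[0]] + x[k[1]] - 1.0) for k in TRANSITIONS}
    # Layer 2: each next-state neuron sums the conjunctions routed to it.
    return [sigma(sum(c[k] for k in TRANSITIONS if TRANSITIONS[k] == j))
            for j in range(2)]

state = [1.0, 0.0]              # start in "even"
for bit in [1, 1, 1, 0]:
    state = step(state, bit)
assert state == [0.0, 1.0]      # three 1s seen: parity is odd
```

The two-layer trick avoids false positives that a single summation layer would produce when several partially matched transitions feed the same next state; the cited Turing-machine constructions extend the same idea with stack or tape encodings in the units' rational activations.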