This paper proposes five partially recurrent neural network architectures to evaluate the different roles played by interlayer and intralayer feedback connections in planning a temporal sequence of states. The first model has only one-to-one feedback connections from the output to the input layer; this topology is taken as the reference. The other models add interlayer and/or intralayer all-to-all feedback connections. All feedback connections except the one-to-one links are trainable. Given an initial and a goal state presented to the network, the models yield a sequence that takes four blocks from the initial to the goal state. The models showed good performance for planning at different levels of complexity. The results...
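The reference topology described above, with fixed one-to-one feedback from the output back to the input layer, resembles a Jordan-style partially recurrent network. A minimal sketch under that assumption; all names, dimensions, and the tanh activation are illustrative choices, not details taken from the paper:

```python
import numpy as np

def jordan_step(x, context, W_in, W_ctx, W_out):
    """One time step of a Jordan-style partially recurrent network.

    The previous output is copied one-to-one into context units
    (a fixed, non-trainable feedback path); W_in, W_ctx, and W_out
    are the trainable weights.
    """
    h = np.tanh(x @ W_in + context @ W_ctx)  # hidden layer
    y = np.tanh(h @ W_out)                   # output layer
    return y, y.copy()  # the new output becomes the next context

# Toy dimensions: 4 input units (state code), 3 hidden, 4 output units.
rng = np.random.default_rng(0)
W_in = rng.normal(size=(4, 3))
W_ctx = rng.normal(size=(4, 3))
W_out = rng.normal(size=(3, 4))

context = np.zeros(4)                  # context starts empty
x = np.array([1.0, 0.0, 0.0, 0.0])     # initial state presented to the network
for _ in range(3):                     # unroll a short sequence of planned states
    y, context = jordan_step(x, context, W_in, W_ctx, W_out)
```

The one-to-one copy `y.copy()` is what distinguishes this reference topology from the other four models, which would replace or augment it with trainable all-to-all feedback matrices.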
WOS: 000244970900005; PubMed ID: 17385626. The architecture and training procedure of a novel recurrent ...
First, a brief introduction to reinforcement learning and to supervised learning with recurrent netw...
In this chapter, we present three different recurrent neural network architectures that we employ fo...
In this paper, we investigate the capabilities of local feedback multilayered networks, a particular...
This paper introduces a new class of dynamic multilayer perceptrons, called Block Feedback Neural ...
Recurrent Neural Networks (RNNs) are connectionist models that operate in discrete time using feedba...
This paper proposes use of feed-forward neural networks and external state feedback to produce an eq...
This paper concerns dynamic neural networks for signal processing: architectural issues are consider...
Recently, fully connected recurrent neural networks have been proven to be computationally rich --- ...
This paper suggests the use of Fourier-type activation functions in fully recurrent neural networks....
This paper focuses on on-line learning procedures for locally recurrent neural networks with emphasi...
In this paper, we discuss some properties of Block Feedback Neural Networks (BFN). In the first p...
Ph.D. Thesis, Computer Science Dept., U. Rochester; Dana H. Ballard, thesis advisor; simultaneously pu...
A recurrent neural network consisting of a small ensemble of eight processing units. In addition to...