In [Meilijson and Ruppin, 1993] we presented a methodological framework describing the two-iteration performance of Hopfield-like attractor neural networks with history-dependent, Bayesian dynamics. We now extend this analysis in a number of directions: input patterns applied to small subsets of neurons, general connectivity architectures, and more efficient use of history. We show that the optimal signal (activation) function has a slanted sigmoidal shape, and provide an intuitive account of activation functions with a non-monotone shape. This function endows the model with some properties characteristic of cortical neurons' firing.
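The slanted-sigmoid result lends itself to a quick illustration. The sketch below is a hedged approximation, not the paper's derived function: the form tanh(βh) + γh and the parameter values beta and gamma are assumptions chosen only to show the characteristic shape (a saturating sigmoid tilted by a linear term).

```python
import numpy as np
import matplotlib.pyplot as plt

def slanted_sigmoid(h, beta=3.0, gamma=0.2):
    """Illustrative slanted sigmoid: a saturating nonlinearity plus a
    linear slant. beta (sigmoid gain) and gamma (slant) are hypothetical
    parameters, not values derived in the paper."""
    return np.tanh(beta * h) + gamma * h

h = np.linspace(-3.0, 3.0, 300)
plt.plot(h, slanted_sigmoid(h), label="slanted sigmoid")
plt.plot(h, np.tanh(3.0 * h), "--", label="plain sigmoid")
plt.xlabel("input field h")
plt.ylabel("signal f(h)")
plt.legend()
plt.show()
```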
We investigate the dynamics of continuous attractor neural networks (CANNs). Due to the translationa...
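The translational invariance this abstract points to can be stated compactly. The rate-model notation below (u for synaptic input, r for firing rate, J for the recurrent kernel) is the conventional CANN formulation, assumed here rather than taken from the truncated text:

```latex
\tau \frac{\partial u(x,t)}{\partial t}
  = -u(x,t) + \int J(x - x')\, r(x',t)\, dx' + I_{\mathrm{ext}}(x,t)
```

Because J depends only on the difference x − x', any spatial translate of a stationary bump solution is again a solution, so the network's stable states form a continuous family rather than a set of isolated attractors.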
The work of this thesis concerns how cortical memories are stored and retrieved. In particular, larg...
Two issues concerning the application of continuous attractors in neural systems are investigated: t...
We examine the performance of Hebbian-like attractor neural networks, recalling stored memory patte...
A conventional view of information processing by line (manifold) attractor networks holds that they ...
We describe a modified attractor neural network in which neuronal dynamics takes place on a time sca...
The analysis is restricted to those features of neural networks endowed by the inborn (n...
INTRODUCTION. Autoassociative attractor neural networks [1,2] provide a powerful paradigm for the st...
We analyse the behaviour of an attractor neural network which exhibits low mean temporal activity le...
In the context of learning in attractor neural networks (ANN) we discuss the issue of the constraint...
Coupling local, slowly adapting variables to an attractor network makes it possible to destabilize all attracto...
Many cognitive and motor functions are enabled by the temporal representation and processing...
By adapting an attractor neural network to an appropriate training overlap, the authors optimize its...
Since they can be represented by neurons and their synaptic connections, attractor networks are widely belie...
Attractor neural networks such as the Hopfield model can be used to model associative memory. An eff...
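As a concrete reference point for the Hopfield-style associative memory these abstracts keep returning to, here is a minimal sketch: Hebbian storage of random ±1 patterns and sign-based recall from a corrupted cue. The sizes, seed, and synchronous update schedule are arbitrary illustration choices, not taken from any of the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                        # neurons, stored patterns

# Random +/-1 memory patterns and the Hebbian weight matrix.
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)              # no self-coupling

def recall(state, steps=20):
    """Synchronous sign-updates until a fixed point (illustrative only)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1             # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Cue: a stored pattern with 10% of its bits flipped.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
out = recall(cue)
print("overlap with stored pattern:", out @ patterns[0] / N)
```

With the pattern load well below capacity (P/N = 0.05 here), the corrupted cue falls inside the basin of attraction and the printed overlap comes out at or near 1.0.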