We have recently investigated a new way of conceptualizing the inferential capacities of non-linear recurrent networks in terms of the change in a statistic of the population activity over the time since a stimulus is presented. Evaluating change allows inferences to be fast, relatively insensitive to noise, and, in suitable cases, invariant to irrelevant dimensions of the stimulus. We proved the technique in the context of the bisection task, which is a popular psychophysical testbed for visual hyperacuity, using recurrent weights whose values were determined by hand. One central observation was that a wide range of structurally different sets of recurrent weights supports near-optimal behaviour. This suggests that a learning algorithm co...
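The invariance claim above can be illustrated with a minimal toy sketch (this is an assumption-laden caricature, not the network from the abstract): if a population statistic grows at a stimulus-dependent rate but carries a nuisance-dependent offset, reading out the change in the statistic between two times cancels the offset, so the estimate is invariant to the irrelevant dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

def population_statistic(t, stimulus, nuisance):
    # Toy model (hypothetical): the statistic's rate of change encodes the
    # stimulus, while the nuisance variable only shifts its baseline.
    return stimulus * t + nuisance + 0.01 * rng.standard_normal()

def infer_stimulus(true_stim, nuisance, t0=1.0, t1=2.0):
    # Change-based readout: the difference across time removes the
    # time-independent nuisance term.
    s0 = population_statistic(t0, true_stim, nuisance)
    s1 = population_statistic(t1, true_stim, nuisance)
    return (s1 - s0) / (t1 - t0)

# Both estimates are close to the true stimulus (0.5), regardless of the
# nuisance value, because the offset cancels in the difference.
est_a = infer_stimulus(true_stim=0.5, nuisance=0.0)
est_b = infer_stimulus(true_stim=0.5, nuisance=10.0)
```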
Residual eye movements introduce positional variation of stimuli on the retina in different trials o...
Weight modifications in traditional neural nets are computed by hard-wired algorithms. Without excep...
A, Schematic of a network with dense feedforward and sparse recurrent connectivity. B, Learning impr...
One standard interpretation of networks of cortical neurons is that they form dynamical attractors. ...
Neural circuits are responsible for carrying out cortical computations. These computations consist o...
From decision making to perception to language, predicting what is coming next...
The inputs to photoreceptors tend to change rapidly over time, whereas physical parameters (e.g. sur...
Rumelhart, Hinton and Williams [Rumelhart et al. 86] describe a learning procedure for layered netwo...
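The core of the Rumelhart, Hinton and Williams procedure is backpropagation: errors at the output are propagated backwards through each layer via the chain rule to obtain weight gradients. A minimal two-layer sketch (layer sizes, learning rate, and target function are illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))            # inputs
y = np.sin(X @ np.array([1.0, -0.5, 0.3]))   # toy target function
W1 = 0.5 * rng.standard_normal((3, 8))       # input -> hidden weights
W2 = 0.5 * rng.standard_normal((8, 1))       # hidden -> output weights

lr, losses = 0.05, []
for _ in range(200):
    h = np.tanh(X @ W1)                      # hidden activations
    pred = h @ W2                            # linear output
    err = pred - y[:, None]                  # dLoss/dpred for mean-squared error
    losses.append(float((err ** 2).mean()))
    # Backpropagate: output-layer gradient, then chain rule through tanh.
    gW2 = h.T @ err / len(X)
    gh = (err @ W2.T) * (1 - h ** 2)         # tanh'(a) = 1 - tanh(a)^2
    gW1 = X.T @ gh / len(X)
    W2 -= lr * gW2                           # gradient-descent updates
    W1 -= lr * gW1
```

After training, the recorded loss falls monotonically in the typical case, which is the behaviour the procedure is designed to produce.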
This study investigates a population decoding paradigm, in which the estimation of stimulus in the p...
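For orientation, a standard population-decoding setup can be sketched as follows (a generic maximum-likelihood decoder under Poisson noise with Gaussian tuning curves; the study's exact paradigm and parameters are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
prefs = np.linspace(-2, 2, 40)               # neurons' preferred stimuli

def rates(s):
    # Gaussian tuning curves with a small baseline firing rate
    return 20 * np.exp(-0.5 * (s - prefs) ** 2) + 1.0

true_s = 0.3
counts = rng.poisson(rates(true_s))          # noisy population response

# Maximum-likelihood estimate: pick the stimulus on a grid that maximizes
# the Poisson log-likelihood of the observed spike counts.
grid = np.linspace(-2, 2, 401)
loglik = np.array([np.sum(counts * np.log(rates(s)) - rates(s)) for s in grid])
estimate = grid[np.argmax(loglik)]
```

With 40 neurons the decoded estimate typically lands close to the true stimulus; the precision scales with the population's Fisher information.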
To function effectively, brains need to make predictions about their environment based on past exper...
In this chapter, we describe the basic concepts behind the functioning of recurrent neural networks ...
Deep feedforward neural network models of vision dominate in both computational neuroscience and eng...
A fundamental task for both biological perception systems and human-engineered agents is to infer un...
Ph.D. Thesis, Computer Science Dept., U Rochester; Dana H. Ballard, thesis advisor; simultaneously pu...
Agents living in volatile environments must be able to detect changes in contingencies while refrain...