The success of deep learning is founded on learning rules with biologically implausible properties, entailing high memory and energy costs. At the Donders Institute in Nijmegen, the Netherlands, we have developed GAIT-Prop, a learning method for large-scale neural networks that alleviates some of the biologically unrealistic attributes of conventional deep learning. By localising weight updates in space and time, our method reduces computational complexity and illustrates how powerful learning rules can be implemented within the constraints on connectivity and communication present in the brain.
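The core idea, that each synapse adjusts itself from signals available locally, with targets handed down layer by layer instead of a global backward gradient pass, can be illustrated with a small sketch. The code below is a hypothetical, simplified local-update rule in the spirit of target propagation; the layer sizes, the transpose-based target step, and all function names are assumptions for illustration and do not reproduce the published GAIT-Prop algorithm.

```python
import numpy as np

# Illustrative sketch of a *local* learning rule (an assumption standing in
# for GAIT-Prop, not the published algorithm). Each layer updates its weights
# using only its own input activity and a target received from the layer
# above; no global backward pass of error gradients is required.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# A tiny network: 4 -> 8 -> 8 -> 2 (sizes chosen arbitrarily for the sketch).
sizes = [4, 8, 8, 2]
W = [rng.normal(0.0, 0.5, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]

def forward(x):
    """Return the activity of every layer for input x."""
    activities = [x]
    for Wl in W:
        activities.append(sigmoid(Wl @ activities[-1]))
    return activities

def local_update(x, y_target, lr=0.1, gamma=0.1):
    """One local, layer-wise update step for a single input/target pair."""
    acts = forward(x)
    # Output target: nudge the output slightly towards the desired label.
    target = acts[-1] + gamma * (y_target - acts[-1])
    # Walk down the stack; each layer touches only locally available signals.
    for l in reversed(range(len(W))):
        pre, post = acts[l], acts[l + 1]
        err = target - post                       # local mismatch at layer l
        # Target for the layer below, carried through the transpose weights
        # (a crude stand-in for a learned or exact target-inversion step).
        target = pre + W[l].T @ err
        # Hebbian-style update built from pre- and post-synaptic quantities.
        W[l] += lr * np.outer(err * post * (1.0 - post), pre)

# Toy usage: repeatedly present one input/target pair.
x = rng.normal(size=4)
y = np.array([1.0, 0.0])
for _ in range(200):
    local_update(x, y)
print(forward(x)[-1])  # output should drift towards [1, 0]
```

The point of the sketch is only that every weight change depends on quantities present at that layer at that moment, which is the sense in which updates are localised in space and time.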