When training neural networks for classification tasks with backpropagation, parameters are updated on every trial, even when the sample is already classified correctly. In contrast, humans concentrate their learning effort on errors. Inspired by human learning, we introduce lazy learning, which learns only from incorrectly classified samples. Lazy learning can be implemented in a few lines of code and requires no hyperparameter tuning. Lazy learning achieves state-of-the-art performance and is particularly well suited to large datasets. For instance, it reaches 99.2% test accuracy on Extended MNIST using a single-layer MLP, and does so 7.6× faster than a matched backpropagation network.
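The rule itself fits in a few lines. As a minimal, illustrative sketch (PyTorch-style; `model`, `optimizer`, `inputs`, and `targets` are assumed to come from a standard training loop, and this is a reading of the rule rather than the paper's reference implementation), gradients are computed and applied only for the samples the network currently misclassifies:

import torch.nn.functional as F

def lazy_learning_step(model, optimizer, inputs, targets):
    # One lazy-learning update: samples the network already classifies
    # correctly are excluded, so they contribute no parameter change.
    logits = model(inputs)
    wrong = logits.argmax(dim=1) != targets  # mask of misclassified samples

    optimizer.zero_grad()
    if wrong.any():
        # Backpropagate the loss of the incorrect samples only.
        loss = F.cross_entropy(logits[wrong], targets[wrong])
        loss.backward()
        optimizer.step()
    # If every sample in the batch is already correct, no update is made.

As training accuracy rises, fewer samples fall into the error mask, so less gradient work is done per batch, which is plausibly where the reported speed-up over standard backpropagation comes from.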