"Sparse" neural networks, in which relatively few neurons or connections are active, are common in both machine learning and neuroscience. Whereas in machine learning, "sparseness" is related to a penalty term which effectively leads to some connecting weights becoming small or zero, in biological brains, sparseness is often created when high spiking thresholds prevent neuronal activity. Inspired by neuroscience, here we introduce sparseness into a reservoir computing network via neuron-specific learnable thresholds of activity, allowing neurons with low thresholds to give output but silencing outputs from neurons with high thresholds. This approach, which we term "SpaRCe", optimises the sparseness level of the reservoir and applies the thr...
Animal nervous systems are highly efficient in processing sensory input. The neuromorphic computing ...
Through the success of deep learning in various domains, artificial neural networks are...
Reservoir Computing (RC) is a well-known strategy for designing Recurrent Neural Networks featured b...
Recently, sparse training methods have started to be established as a de facto approach for training...
© 2020 Elsevier Ltd. All rights reserved. This manuscript is licensed under the Creative Commons Att...
The mushroom body is the key network for the representation of learned olfactory stimuli in Drosophi...
Growing evidence indicates that only a sparse subset from a pool of sensory neurons is active for th...
This work investigates Sparse Neural Networks, which are artificial neural information processing sy...
It is believed that energy efficiency is an important constraint in brain evolution. As synaptic tra...
A combination of experimental and theoretical studies has provided converging evidence for the hy...
Modern machine learning techniques take advantage of the exponentially rising computational power in n...
The training of sparse neural networks is becoming an increasingly important tool for reducing the ...
http://deepblue.lib.umich.edu/bitstream/2027.42/112572/1/12868_2012_Article_2540.pd