Brain-inspired event-based processors have attracted considerable attention for edge deployment because of their ability to process Convolutional Neural Networks (CNNs) efficiently by exploiting sparsity. On such processors, a critical property is that the speed and energy consumption of CNN inference are approximately proportional to the number of non-zero values in the activation maps. To achieve top performance, a training algorithm is therefore needed that strongly suppresses activations in CNNs. We propose a novel training method, called Adaptive-Regularization Training Schedule (ARTS), which dramatically decreases the non-zero activations in a model by adaptively altering the regularization coefficient throughout training. We ev...
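The abstract above describes adapting an activation-regularization coefficient over the course of training to drive down non-zero activations. As a rough illustration only, the sketch below penalizes the L1 norm of ReLU outputs and rescales the coefficient from validation accuracy; the names (ReLUWithL1, adapt_lambda), the multiplicative update rule, and the thresholds are all hypothetical assumptions, not the paper's actual ARTS schedule, which is not specified in this excerpt.

```python
# Minimal sketch of adaptive activation regularization in PyTorch.
# The adaptation rule is a hypothetical stand-in for ARTS: push the
# penalty up while accuracy holds, back off when it degrades.
import torch
import torch.nn as nn

class ReLUWithL1(nn.Module):
    """ReLU that records the L1 norm of its output so that non-zero
    activations can be penalized during training."""
    def __init__(self):
        super().__init__()
        self.l1 = torch.tensor(0.0)

    def forward(self, x):
        out = torch.relu(x)
        self.l1 = out.abs().sum()  # graph-connected, so the penalty is differentiable
        return out

def activation_penalty(model):
    # Sum the recorded L1 norms over all instrumented activations.
    return sum(m.l1 for m in model.modules() if isinstance(m, ReLUWithL1))

def train_step(model, batch, target, optimizer, lam,
               criterion=nn.CrossEntropyLoss()):
    # Task loss plus lam-weighted activation sparsity penalty.
    optimizer.zero_grad()
    loss = criterion(model(batch), target) + lam * activation_penalty(model)
    loss.backward()
    optimizer.step()
    return loss.item()

def adapt_lambda(lam, val_acc, acc_floor=0.99, up=1.5, down=0.5):
    # Hypothetical schedule: increase the coefficient while validation
    # accuracy stays above a tolerance (relative to the dense baseline),
    # otherwise relax it to recover accuracy.
    return lam * up if val_acc >= acc_floor else lam * down
```

In a full training loop, one would presumably call adapt_lambda once per epoch on held-out accuracy, so the regularization pressure rises for as long as the model tolerates it and eases off otherwise.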
Neural networks are more expressive when they have multiple layers. In turn, conventional training m...
We investigate filter level sparsity that emerges in convolutional neural networks (CNNs) which empl...
Deep neural networks include millions of learnable parameters, making their deployment over resource...
Brain-inspired event-driven processors execute deep neural networks (DNNs) in a sparsity-aware manne...
In the recent past, real-time video processing using state-of-the-art deep neural networks (DNN) has...
Inspired by the adaptation phenomenon of neuronal firing, we propose the regularity normalization (R...
The large capacity of neural networks enables them to learn complex functions. To avoid overfitting,...
Neural network training is computationally and memory intensive. Sparse training can reduce the bur...
Deep neural network models are commonly used in various real-life applications due to their high pre...
The undeniable computational power of artificial neural networks has granted the scientific communit...
Convolutional neural networks (CNNs) outperform traditional machine learning algorithms across a wid...
Deep learning has dramatically improved performance in various image analysis applications in the la...