Brain-inspired event-based processors have attracted considerable attention for edge deployment because they can process Convolutional Neural Networks (CNNs) efficiently by exploiting sparsity. On such processors, a critical property is that the speed and energy consumption of CNN inference are approximately proportional to the number of non-zero values in the activation maps. To achieve top performance, therefore, an efficient training algorithm is required that strongly suppresses activations in CNNs. We propose a novel training method, called Adaptive-Regularization Training Schedule (ARTS), which dramatically decreases the non-zero activations in a model by adaptively altering the regularization coefficient throughout training. We ev...
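The abstract does not spell out how the regularization coefficient is adapted, so the following is only a minimal sketch of the general idea: an L1 penalty on post-ReLU activation maps whose coefficient is raised while the measured activation density exceeds a target and relaxed otherwise. Everything below (the SmallCNN model, activation_penalty, density, adapt_lambda, the multiplicative update rule, and the target density of 0.2) is an illustrative assumption, not the authors' ARTS schedule.

# Hypothetical sketch of adaptive activation-sparsity regularization (not ARTS itself).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, 3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        self.fc = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        acts = []                           # collect post-ReLU activation maps
        x = F.relu(self.conv1(x)); acts.append(x)
        x = F.max_pool2d(x, 2)
        x = F.relu(self.conv2(x)); acts.append(x)
        x = F.max_pool2d(x, 2)
        return self.fc(x.flatten(1)), acts

def activation_penalty(acts):
    # L1 penalty on activations: a differentiable proxy for the non-zero count.
    return sum(a.abs().mean() for a in acts)

def density(acts):
    # Fraction of non-zero activation values (what event-based hardware cares about).
    nonzero = sum((a > 0).float().sum() for a in acts)
    total = sum(a.numel() for a in acts)
    return (nonzero / total).item()

def adapt_lambda(lam, measured_density, target_density=0.2, rate=1.1):
    # Assumed rule: grow the coefficient while activations are denser than the
    # target, relax it once the target is met.
    return lam * rate if measured_density > target_density else lam / rate

model = SmallCNN()
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
lam = 1e-4
for step in range(100):                     # random data, for illustration only
    x = torch.randn(8, 3, 32, 32)
    y = torch.randint(0, 10, (8,))
    logits, acts = model(x)
    loss = F.cross_entropy(logits, y) + lam * activation_penalty(acts)
    opt.zero_grad()
    loss.backward()
    opt.step()
    lam = adapt_lambda(lam, density(acts))  # adapt the coefficient over training

The L1 term serves as the differentiable surrogate that the optimizer can act on, while the density measurement stays outside the gradient path and only steers the coefficient.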
The growing energy and performance costs of deep learning have driven the community to reduce the si...
Deep neural networks (DNN) are the state-of-the-art machine learning models outperforming traditiona...
Deep learning has achieved tremendous success on various computer vision tasks. However, deep learni...
Brain-inspired event-driven processors execute deep neural networks (DNNs) in a sparsity-aware manne...
In the recent past, real-time video processing using state-of-the-art deep neural networks (DNN) has...
Deep Neural Networks (DNNs) have emerged as an important class of machine learning algorithms, provi...
Neural network training is computationally and memory intensive. Sparse training can reduce the bur...
Deep neural network models are commonly used in various real-life applications due to their high pre...
We investigate filter level sparsity that emerges in convolutional neural networks (CNNs) which empl...
Processing visual data on mobile devices has many applications, e.g., emergency response and trackin...
Through the success of deep learning in various domains, artificial neural networks are currently am...
We propose a novel method for training a neural network for image classification to reduce input dat...
The development of deep learning has led to a dramatic increase in the number of applications of art...
Inspired by the adaptation phenomenon of neuronal firing, we propose the regularity normalization (R...