Long Short-Term Memory (LSTM) has achieved state-of-the-art performance on a wide range of tasks. Its strong performance stems from its long-term memory ability, which is well suited to sequential data, and from its gating structure, which controls the flow of information. However, LSTMs tend to be memory-bandwidth limited in realistic applications and, as model sizes keep growing, require prohibitively long training and inference times. To tackle this problem, various efficient model compression methods have been proposed. Most of them, however, require a large and expensive pre-trained model, which is impractical on resource-limited devices with strictly limited memory budgets. To remedy this situation, in this paper, we incorporat...
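The gating structure referred to above can be sketched as a single LSTM step. This is a generic minimal sketch in NumPy, not the specific architecture of the paper; the gate ordering (input, forget, cell candidate, output) and the weight names `W`, `U`, `b` are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x: input (D,), h_prev/c_prev: previous hidden and cell state (H,).
    W: (4H, D), U: (4H, H), b: (4H,) — the four gates stacked, in the
    assumed order: input, forget, cell candidate, output.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: how much new information enters
    f = sigmoid(z[H:2*H])      # forget gate: how much old memory is kept
    g = np.tanh(z[2*H:3*H])    # candidate cell state
    o = sigmoid(z[3*H:4*H])    # output gate: how much memory is exposed
    c = f * c_prev + i * g     # gated update of the long-term memory
    h = o * np.tanh(c)         # hidden state read out through the output gate
    return h, c
```

The cell state `c` is what gives the LSTM its long-term memory: the forget and input gates decide, element-wise, what to retain and what to overwrite at each step.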
We investigate online nonlinear regression and introduce novel regression structures based on the lo...
Hardware accelerators for neural network inference can exploit common data properties for performanc...
Artificial Neural Networks (ANNs) have emerged as hot topics in the research community. Despite the ...
The iterative shrinkage-thresholding algorithm (ISTA) is one of the most popular optimization solvers to ...
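For reference, ISTA in its standard form alternates a gradient step with a soft-thresholding step to solve the lasso problem. The sketch below is the textbook algorithm, not the variant the truncated abstract goes on to describe; the step size `1/L` uses the Lipschitz constant of the gradient.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters=500):
    """Solve min_x 0.5 * ||A x - b||^2 + lam * ||x||_1 by ISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of A^T(Ax - b)
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

With `A` equal to the identity, ISTA reduces to one soft-thresholding of `b`, which makes the solver easy to sanity-check.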
Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rig...
Recurrent neural networks (RNNs) have achieved state-of-the-art performances on various applications...
Long short-term memory (LSTM) has transformed both machine learning and neurocomputing fields. Accor...
Long Short-Term Memory (LSTM) is a type of Recurrent Neural Network (RNN) that is designed to handle...
Large neural networks are very successful in various tasks. However, with limited data, the generali...
One of the most popular approaches for neural network compression is sparsification — learning spars...
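The simplest instance of the sparsification idea mentioned above is one-shot magnitude pruning: zero out the smallest-magnitude fraction of a weight tensor. This generic sketch illustrates the idea only; it is not the learning-based sparsification method the abstract describes.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Return a copy of w with the smallest-magnitude `sparsity`
    fraction of entries set to zero."""
    k = int(round(sparsity * w.size))
    if k == 0:
        return w.copy()
    # k-th smallest absolute value becomes the pruning threshold
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    mask = np.abs(w) > thresh
    return w * mask
```

In practice such a mask is usually applied iteratively during training, with the surviving weights fine-tuned between pruning rounds.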
Training sparse neural networks with adaptive connectivity is an active research topic. Such network...
The strength of long short-term memory neural networks (LSTMs) that have been applied is more locate...
Time series prediction can be generalized as a process that extracts useful information from histori...
The growing energy and performance costs of deep learning have driven the community to reduce the si...