As convolutional neural networks (CNNs) continue to grow in size, quantization and loop tiling (also called loop breaking) are mandatory for implementing CNNs on embedded systems. However, channel loop tiling of quantized CNNs induces unexpected errors. We explain why these errors arise and how they affect the accuracy of state-of-the-art CNNs. We also propose a method that recovers accuracy under channel tiling by compressing and decompressing the most significant bits of partial sums. Using the proposed method, accuracy is recovered by 12.3% with only 1% circuit-area overhead and 2% additional power consumption.
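To illustrate the error mechanism described above, the following is a minimal sketch (not the paper's method) of a channel-tiled dot product in which the partial sum must be stored in a fixed bit width between tiles. The `quantize` helper, the int8 operand widths, the 16-channel tile size, and the 16-bit partial-sum width are all illustrative assumptions; the point is only that re-quantizing the partial sum after each tile can clip its most significant bits, which a single full-precision accumulation would not.

```python
import numpy as np

def quantize(x, bits=16):
    # Hypothetical saturating integer quantizer: clip to the signed range of `bits`.
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return np.clip(np.round(x), lo, hi)

def conv_channel_tiled(acts, weights, tile=16, psum_bits=16):
    """Dot product over input channels with the channel loop tiled and the
    partial sum re-quantized after every tile (as when it is spilled to a
    fixed-width buffer between tiles)."""
    n_ch = acts.shape[0]
    psum = 0
    for c0 in range(0, n_ch, tile):
        psum += np.dot(acts[c0:c0 + tile], weights[c0:c0 + tile])
        # Storing the partial sum between tiles forces it into `psum_bits`,
        # which can clip or truncate its most significant bits.
        psum = quantize(psum, psum_bits)
    return psum

rng = np.random.default_rng(0)
acts = rng.integers(-128, 128, size=256).astype(np.int64)     # int8-range activations
weights = rng.integers(-128, 128, size=256).astype(np.int64)  # int8-range weights

exact = int(np.dot(acts, weights))                                        # full-precision accumulation
tiled = int(conv_channel_tiled(acts, weights, tile=16, psum_bits=16))     # tiled, quantized accumulation
print(exact, tiled, exact - tiled)  # the difference is the tiling-induced error
```

Running the sketch shows a nonzero difference between the two accumulations whenever an intermediate partial sum exceeds the assumed partial-sum range, which is the kind of error that preserving the partial sums' most significant bits is meant to avoid.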