Neural network (NN) processing has two phases: inference and training. Although research on the inference phase of edge artificial intelligence (AI) has made considerable progress, the training phase remains an open problem. In the training phase, an NN incurs a high calculation cost: the number of bits (bitwidth) required for training is several orders of magnitude larger than that required for inference. Moreover, training algorithms optimized for software are not well suited to hardware-oriented NNs. We therefore propose a new training algorithm for edge AI: backpropagation (BP) using a ternarized gradient. This ternarized backpropagation (TBP) provides a balance between calculation cost and performance. Empirical results ...
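The abstract does not specify how the gradient is ternarized, but the core idea of mapping each gradient element to {-1, 0, +1} can be sketched as follows. The fixed threshold and the plain SGD update here are illustrative assumptions, not the paper's exact rule:

```python
import numpy as np

def ternarize(grad, threshold):
    # Map each gradient element to {-1, 0, +1}:
    # zero where |g| is below the threshold, otherwise sign(g).
    return np.where(np.abs(grad) < threshold, 0.0, np.sign(grad))

def tbp_update(weights, grad, lr=0.01, threshold=1e-3):
    # Hypothetical weight update: standard SGD, but driven by the
    # ternarized gradient instead of the full-precision one.
    return weights - lr * ternarize(grad, threshold)
```

Because the ternarized gradient carries at most ~1.6 bits per element, the multiply in the update degenerates to an add/subtract/skip, which is what makes this attractive for low-power training hardware.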
Deep Neural Network (DNN) models are now commonly used to automate and optimize complicated tasks in...
This paper presents some simple techniques to improve the backpropagation algorithm. Since learning ...
The architecture of an artificial neural network has a great impact on the generalization power. M...
The creation of effective computational models that function within the power limitations of edge de...
In this paper we explore different strategies to guide backpropagation algorithm used for training a...
Edge computing, which has been gaining attention in recent years, has many advantages, such as redu...
Backpropagation (BP)-based gradient descent is the general approach to train a neural network with a...
To enable learning on edge devices with fast convergence and low memory, we present a novel backprop...
The effects of silicon implementation on the backpropagation learning rule in artificial neural syst...
Edge intelligence systems, the intersection of edge computing and artificial intelligence (AI), are ...
This thesis investigates the possibility of porting a neural network model trained and modeled in Te...
The ever-growing computational demands of increasingly complex machine learning models frequently ne...
The capabilities of natural neural systems have inspired new generations of machine learning algorit...
On-chip training of neural networks (NNs) is regarded as a promising training method for neuromorphi...