Neural networks (NNs) have been used in several areas, showing their potential but also their limitations. One of the main limitations is the long time required for training; this is a problem when a fast training process is needed to respond to changes in the application domain. A possible way to accelerate the learning process of an NN is to implement it in hardware, but, due to the high cost and reduced flexibility compared with the original central processing unit (CPU) implementation, this solution is often not chosen. Recently, the power of the graphics processing unit (GPU) on the market has increased, and it has started to be used in many applications. In particular, a kind of NN named radial basis function netwo...
Abstract. This work presents the implementation of Feedforward Multi-Layer Perceptron (FFMLP) Neural...
This thesis deals with the implementation of an application for artificial neural networks simulatio...
This paper makes two principal contributions. The first is that there appears to be no previous de...
Radial Basis Function Neural Networks (RBFNN) are used in a variety of applications such as pattern re...
This paper presents work on the implementation of a neural network using radial basis funct...
Abstract—Large scale artificial neural networks (ANNs) have been widely used in data processing appl...
A scalable and reconfigurable architecture for accelerating classification using Radial Basis Functi...
In this paper we present the design and analysis of scalable hardware architectures for training learnin...
Abstract. This paper presents a parallel architecture for a radial basis function (RBF) neural netwo...
This paper presents a parallel architecture for a radial basis function (RBF) neural network used fo...
Graduation date: 2010. We took the back-propagation algorithms of Werbos for recurrent and feed-forwar...
This paper presents a novel VLSI architecture for the training of radial basis function (RBF) networ...
Open-source deep learning tools have been widely distributed and have gained popularity in the past ...
The article discusses approaches to implementing a neural network in parallel. The issues o...