A generalization of a class of neural network architectures based on multiple quantization of the input space combined with memory lookup operations is presented under the name of the general memory neural network (GMNN). Within this common framework it is shown that networks of this type are, for a variety of learning schemes, response-equivalent to basis function networks (i.e., radial basis function and kernel regression networks). In particular, this equivalence holds even if a GMNN does not employ explicit basis functions, which makes the architecture attractive from an implementational point of view and allows fast operation in both the learning and response modes. Variants of the GMNN are discussed, and examples of existing architectures ...
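To make the quantize-and-look-up idea concrete, the following is a minimal numerical sketch of a GMNN-style estimator in one dimension. The class name, the offset-grid layout and the LMS-style update are illustrative assumptions (a CMAC-like caricature), not the paper's exact scheme: each of several shifted quantizers maps the input to a single memory cell, the response is the average of the stored cell values, and learning adjusts only the activated cells, so no basis function is ever evaluated explicitly.

import numpy as np

class QuantizedMemoryNet:
    def __init__(self, n_quantizers=8, cell_width=0.5, lo=0.0, hi=10.0, lr=0.2):
        # each quantizer is the same grid shifted by a different offset
        self.offsets = np.linspace(0.0, cell_width, n_quantizers, endpoint=False)
        self.cell_width = cell_width
        self.lo = lo
        self.lr = lr
        n_cells = int(np.ceil((hi - lo) / cell_width)) + 2
        self.memory = np.zeros((n_quantizers, n_cells))  # one lookup table per quantizer

    def _cells(self, x):
        # index of the single cell activated by x in each shifted quantizer
        return ((x - self.lo + self.offsets) // self.cell_width).astype(int)

    def predict(self, x):
        idx = self._cells(x)
        return self.memory[np.arange(len(idx)), idx].mean()

    def learn(self, x, target):
        # LMS-style correction applied to the activated cells only
        idx = self._cells(x)
        err = target - self.predict(x)
        self.memory[np.arange(len(idx)), idx] += self.lr * err

# usage: learn a 1-D function from samples, then query it
net = QuantizedMemoryNet()
rng = np.random.default_rng(0)
for _ in range(5000):
    x = rng.uniform(0.0, 10.0)
    net.learn(x, np.sin(x))
print(net.predict(3.0))  # approximately sin(3.0)

Because a query touches only one cell per quantizer, both learning and recall cost a handful of table lookups, which is the source of the speed advantage claimed above.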
This dissertation studies neural networks for pattern classification and universal approximation. Th...
Architecture (a) corresponds to the algorithm of learning and recall as described in the text. In...
Neural networks (NN) have achieved great successes in pattern recognition and machine learning. Howe...
A common framework for architectures combining multiple vector-quantization of the input space with ...
This paper describes a memory-based network that provides estimates of continuous variables...
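The memory-based estimate of a continuous variable described in that abstract is closely related to kernel regression, one of the basis function networks named above. A minimal Nadaraya-Watson sketch, with an assumed Gaussian kernel and bandwidth, is:

import numpy as np

def kernel_regression(x_query, x_train, y_train, bandwidth=0.5):
    # Nadaraya-Watson estimate: a kernel-weighted average of stored targets.
    # Every training pair is kept in memory; the prediction at x_query is the
    # average of y_train weighted by a Gaussian kernel on the distance to x_train.
    weights = np.exp(-0.5 * ((x_query - x_train) / bandwidth) ** 2)
    return np.sum(weights * y_train) / np.sum(weights)

# usage: noisy samples of a smooth function, then a query
rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 10.0, size=200)
y_train = np.sin(x_train) + 0.1 * rng.normal(size=200)
print(kernel_regression(3.0, x_train, y_train))  # close to sin(3.0)

In this sketch learning amounts to storing the training pairs, and prediction is a weighted average over the stored targets.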
Originally, artificial neural networks were built from biologically inspired units called perceptron...
Usually, generalization is considered as a function of learning from a set of examples. In present w...
A new architecture for networks of RAM-based Boolean neurons is presented which, whilst retaining le...
An overview of neural networks, covering multilayer perceptrons, radial basis functions, constructiv...
MEng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus. The ability o...
This paper introduces a change in the structure of an artificial neuron (McCulloch and Pitts), to im...
The paper presents the design of three types of neural networks with different features, i...
We study the learning and generalisation ability of a specific two-layer feed-forward neural network and...