This work is based on a weightless logical neuron model, the Random Access Memory (RAM) [1]. Because of memory consumption, RAMs cannot be fully connected in a larger network; instead, the connectivity structure is decomposed into n-tuple connections, which gives rise to additional generalization properties [2]. Here a learning procedure for a feed-forward RAM network is presented that allows the neurons to exchange their input connections in order to resolve target conflicts in their function and thereby optimize internal representations.

1 Reduction of RAM Sizes in Layers of Neurons

A RAM neuron can be trained by merely entering the desired responses into the memory locations addressed by its input (see RAM in Fig. 3). But difficulties arise...
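The training rule described above, storing the desired response in the memory location addressed by the binary input tuple, can be sketched minimally as follows (class and method names are illustrative, not from the paper):

```python
class RAMNeuron:
    """Weightless RAM neuron: an n-tuple of binary inputs addresses
    one of 2**n memory locations holding the learned response."""

    def __init__(self, n_inputs):
        # one memory cell per possible input address, default response 0
        self.memory = [0] * (2 ** n_inputs)

    def _address(self, bits):
        # interpret the binary input tuple as a memory address
        addr = 0
        for b in bits:
            addr = (addr << 1) | b
        return addr

    def train(self, bits, desired):
        # training is a single write: store the desired response
        # at the location addressed by the input
        self.memory[self._address(bits)] = desired

    def recall(self, bits):
        # recall is a single read from the addressed location
        return self.memory[self._address(bits)]


neuron = RAMNeuron(3)
neuron.train((1, 0, 1), 1)       # teach one pattern
print(neuron.recall((1, 0, 1)))  # 1
print(neuron.recall((0, 0, 0)))  # 0 (untrained location, default)
```

Note that training here is a single memory write rather than an iterative weight update, which is why the memory size, and hence the decomposition into small n-tuples, dominates the cost of such networks.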
We developed an identification method to find the strength of the connections between neurons from t...
Humans are able to form internal representations of the information they process – a capability wh...
Artificial neural networks, taking inspiration from biological neurons, have become an invaluable to...
This paper presents a survey of a class of neural models known as Weightless Neural Networks (WNNs)....
Artificial neural networks have, in recent years, been very successfully applied in a wide range of ...
Rumelhart, Hinton and Williams [Rumelhart et al. 86] describe a learning procedure for layered netwo...
A new architecture for networks of RAM-based Boolean neurons is presented which, whilst retaining le...
Abstract. Random Access Memory (RAM) nodes can play the role of artificial neurons that are addresse...
The Problem: How can a distributed system of independent processors, armed with local communication ...
The performance of an Artificial Neural Network (ANN) strongly depends on its hidden layer architect...
I extend the class of exactly solvable feed-forward neural networks discussed in a previous publicat...
Abstract: To reduce random access memory (RAM) requirements and to increase speed of recognition alg...
Network training algorithms have heavily concentrated on the learning of connection weights. Little ...
The neurons are structured in layers and connections are drawn only from the previous layer to the n...
Two algorithms have recently been reported for training multi-layer networks of neurons with Heavisi...