Associative memories with recurrent connectivity can be built from networks of perceptrons and trained using the perceptron learning rule. Recent work has shown how a perceptron can be diluted, removing incoming connections to produce an optimal perceptron: one that classifies its training set with optimal margin and minimal connectivity. Here this technique is applied to the perceptrons in associative memory networks. The main result shows that, using these dilution methods, effective associative memories can be built with very sparse connectivity.

Keywords: Associative memory, Hopfield network, optimal dilution.
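The setup described above can be sketched in a few lines: each neuron of the recurrent network is a perceptron whose incoming weights are trained, with the perceptron learning rule, to reproduce that neuron's bit in every stored pattern, and the trained weights are then diluted. This is a minimal illustrative sketch, not the paper's method; in particular, the pruning step here simply keeps the strongest half of the weights by magnitude, a hypothetical stand-in for the optimal-dilution procedure, and all sizes and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 10                        # neurons and stored patterns (illustrative sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Train the incoming weights of a single neuron (index 0): given the other
# neurons' states in each stored pattern, it must output its own bit.
targets = patterns[:, 0]
inputs = patterns.copy()
inputs[:, 0] = 0                      # no self-connection

w = np.zeros(N)
for _ in range(1000):                 # perceptron learning rule
    errors = 0
    for x, t in zip(inputs, targets):
        if t * (w @ x) <= 0:          # misclassified or zero margin
            w += t * x
            errors += 1
    if errors == 0:                   # training set classified correctly
        break

# Dilute: keep only the N//2 strongest incoming connections by magnitude
# (a crude stand-in for optimal dilution, which also preserves the margin).
k = N // 2
keep = np.argsort(np.abs(w))[-k:]
w_diluted = np.zeros(N)
w_diluted[keep] = w[keep]

print("connections before/after dilution:",
      np.count_nonzero(w), "/", np.count_nonzero(w_diluted))
```

In the full associative memory, this training-and-dilution step would be repeated independently for every neuron's incoming weight vector, giving a sparse recurrent weight matrix.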