Abstract. This paper considers a theory of neural networks based on the assumptions of the Ising model. Indirect couplings, Dirac distributions, and a corrected Hebb rule are introduced and analyzed. The patterns stored in the network and the indirect couplings are treated as random. Alongside the full theory based on Dirac distributions, simplified stationary mean-field equations and their solutions are presented, taking into account the ergodicity of the average overlap and the indirect order parameter. Modeling results are shown to corroborate the theoretical statements and their applied aspects.
We survey the statistical mechanics approach to the analysis of neural networks of the Hopfield type...
This paper provides the complete illustration about the observation of new group of distributive mem...
The question of the nature of the distributed memory of neural networks is considered. Since the mem...
Abstract: Neural networks are nowadays both powerful operational tools (e.g., for pattern recognitio...
Abstract. In this paper, we study a neural network. In this set, each neuron is characterized by its n...
Abstract. The typical fraction of the space of interactions between each pair of N Ising spins which...
Abstract. The more realistic neural soma and synaptic nonlinear relations and an alternative mean fi...
The macroscopic dynamics of an extremely diluted as well as of a fully connected three-state neural ...
The study of neural networks by physicists started as an extension of the theory of spin glasses. Fo...
A general mean-field theory is presented for an attractor neural network in which each elementary un...
Neural networks with symmetric couplings which have an intermediate form between the Hebb learning r...
This contribution presents the formalism and some important results specific to the study of...
A summary is presented of the statistical mechanical theory of learning a rule with a neural network...
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neu...
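Several of the abstracts above concern Hopfield-type networks with Hebbian couplings. As a point of reference for the recurring model, here is a minimal sketch of the standard formulation (symmetric Hebb couplings, zero-temperature sign dynamics); the parameter choices (N = 100 neurons, P = 5 patterns, 10% corruption) are illustrative assumptions, not taken from any of the cited papers:

```python
import numpy as np

def hebb_couplings(patterns):
    """Hebb rule: J_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, with zero diagonal."""
    n = patterns.shape[1]
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J

def recall(J, state, steps=20):
    """Zero-temperature dynamics: s_i <- sign(sum_j J_ij s_j), iterated to a fixed point."""
    s = state.copy()
    for _ in range(steps):
        s_new = np.where(J @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
N, P = 100, 5                      # load alpha = P/N = 0.05, well below capacity
patterns = rng.choice([-1, 1], size=(P, N))
J = hebb_couplings(patterns)

# Corrupt 10% of the spins in the first pattern, then let the network relax.
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
recovered = recall(J, probe)
overlap = recovered @ patterns[0] / N   # overlap m with the stored pattern
```

At this low storage load the relaxed state has an overlap close to 1 with the stored pattern, which is the retrieval regime the mean-field analyses in these abstracts characterize.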