We propose and develop an original model of associative memories relying on coded neural networks. Associative memories are devices able to learn messages and then retrieve them from part of their content. The state-of-the-art model in terms of efficiency (the ratio of the number of bits learned to the number of bits used) is the Hopfield Neural Network, whose learning diversity (the number of messages it can store) is lower than n/(2 log(n)), where n is the number of neurons in the network. Our work consists of using error-correcting coding and decoding techniques, more precisely distributed codes, to considerably increase the performance of associative memories. To achieve this, we introduce original codes whose codewords rely on neural cliques...
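To make the clique-based idea concrete, below is a minimal illustrative sketch in Python of an associative memory that stores each message as a clique of binary connections between neurons spread over clusters, and retrieves a message from part of its content by an iterated winner-take-all rule per cluster. The class name `CliqueMemory`, the parameters, and the exact update rule are assumptions chosen for illustration; this is a sketch of the general principle, not the precise construction proposed here.

```python
import numpy as np

class CliqueMemory:
    """Illustrative clique-based associative memory (assumed structure).

    A message of c sub-symbols activates one neuron per cluster (c clusters
    of cluster_size neurons each) and is stored as a clique: every pair of
    activated neurons gets a binary connection.
    """

    def __init__(self, clusters, cluster_size):
        self.c = clusters
        self.cluster_size = cluster_size
        n = clusters * cluster_size
        self.W = np.zeros((n, n), dtype=bool)  # binary connection matrix

    def _unit(self, cluster, symbol):
        # index of the neuron representing `symbol` in `cluster`
        return cluster * self.cluster_size + symbol

    def store(self, message):
        units = [self._unit(i, s) for i, s in enumerate(message)]
        for a in units:
            for b in units:
                if a != b:
                    self.W[a, b] = True  # add the clique's edges

    def retrieve(self, partial, iterations=4):
        """partial: list with known sub-symbols and None for erased ones."""
        known = {self._unit(i, s) for i, s in enumerate(partial) if s is not None}
        active = set(known)
        for _ in range(iterations):
            new_active = set()
            for i in range(self.c):
                # score each neuron of cluster i by its number of active inputs
                scores = [sum(self.W[a, self._unit(i, s)] for a in active)
                          for s in range(self.cluster_size)]
                best = max(scores)
                # winner-take-all inside the cluster
                new_active |= {self._unit(i, s) for s in range(self.cluster_size)
                               if scores[s] == best and best > 0}
            active = new_active | known  # keep the known positions clamped
        # read out one sub-symbol per cluster; ambiguous clusters give None
        out = []
        for i in range(self.c):
            winners = [s for s in range(self.cluster_size)
                       if self._unit(i, s) in active]
            out.append(winners[0] if len(winners) == 1 else None)
        return out

# Example: store one message and recover it from half of its sub-symbols.
mem = CliqueMemory(clusters=4, cluster_size=16)
mem.store([3, 7, 1, 12])
print(mem.retrieve([3, 7, None, None]))  # -> [3, 7, 1, 12]
```

With sparse binary connections of this kind, the memory cost grows with the square of the number of neurons while each stored message only uses a handful of edges, which is the intuition behind the efficiency gain over Hopfield-style weighted networks; the exact diversity and efficiency figures are those established in the document itself.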