Abstract

Theoretical models of associative memory generally assume most of their parameters to be homogeneous across the network. Conversely, biological neural networks exhibit high variability of structural as well as activity parameters. In this paper, we extend the classical clipped learning rule by Willshaw to networks with inhomogeneous sparseness, i.e., the number of active neurons may vary across memory items. We evaluate this learning rule for sequence memory networks with instantaneous feedback inhibition and show that, not surprisingly, memory capacity degrades with increased variability in sparseness. The lo...