We study the structure of multistable recurrent neural networks. The activation function is simplified to a nonsmooth Heaviside step function. This nonlinearity partitions the phase space into regions with different, yet linear, dynamics. We derive how multistability is encoded within the network architecture: stable states are identified by semipositivity constraints on the synaptic weight matrix. These restrictions can be separated by their effects on the signs or the strengths of the connections. Exact results on network topology, sign stability, weight matrix factorization, pattern completion and pattern coupling are derived and proven. These results may lay the foundation for more complex recurrent neural networks and for neurocomputing.
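The central objects of the abstract can be made concrete with a small numerical sketch. The snippet below is an illustration only, not the paper's own construction: it assumes a discrete-time synchronous update x_{t+1} = H(W x_t - theta) with Heaviside step H, and the threshold theta, the update rule, and the 4-neuron weight matrix are assumptions introduced for this example. It checks that two complementary binary patterns are both fixed points of the same weight matrix, i.e. that the network is multistable, and within each fixed binary state the dynamics reduce to a linear map, consistent with the phase-space partition described above.

import numpy as np

def heaviside(v):
    # Nonsmooth step nonlinearity: 1 where the input is positive, else 0.
    return (v > 0).astype(float)

def step(x, W, theta=0.5):
    # One synchronous update x_{t+1} = H(W x_t - theta).
    # The discrete-time rule and the threshold are illustrative assumptions.
    return heaviside(W @ x - theta)

def is_fixed_point(x, W, theta=0.5):
    # A binary pattern x is a stable state of this toy model iff the update maps it
    # to itself: every active unit receives supra-threshold input, every silent unit does not.
    return np.array_equal(step(x, W, theta), x)

# Toy example: two mutually excitatory pairs with cross-inhibition,
# chosen so that (1,1,0,0) and (0,0,1,1) are both fixed points of the same W.
W = np.array([
    [ 0.0,  1.0, -1.0, -1.0],
    [ 1.0,  0.0, -1.0, -1.0],
    [-1.0, -1.0,  0.0,  1.0],
    [-1.0, -1.0,  1.0,  0.0],
])

for pattern in ([1, 1, 0, 0], [0, 0, 1, 1]):
    x = np.array(pattern, dtype=float)
    print(pattern, "fixed point:", is_fixed_point(x, W))

Running the sketch prints True for both patterns: a single weight matrix supports two coexisting stable states, which is the minimal form of the multistability the paper characterizes through constraints on W.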
The remarkable properties of information-processing by biological and artificial neuronal networks a...
Winner-Take-All (WTA) networks are recurrently connected populations of excitatory and inhibitory ne...
Neural computation in biological and artificial networks relies on the nonlinear summation of many i...
(A) An exemplary recurrent neural network of 12 neurons. The network state has a 4-Winner-Take-A...
Recurrent neural networks have received much attention due to their nonlinear dynamic behavior. One ...
This paper introduces recurrent equilibrium networks (RENs), a new class of nonlinear dynamical mode...
Recurrent Neural Networks (RNNs) represent an important class of bio-inspired learning machines ...
The computational abilities of recurrent networks of neurons with a linear activation function above...
Multistable networks have attracted much interest in recent years, since multistability is ...
A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a...
Recurrent neural networks (RNNs) are widely used throughout neuroscience as models of local neural a...
The brain consists of many interconnected networks with time-varying, partially autonomous activity....
Oscillations arise in many real-world systems and are associated with both functional and dysfunctio...
Changes in behavioral state, such as arousal and movements, strongly affect neural activity in senso...