We consider the method of Reduction of Dissipativity Domain for proving global Lyapunov stability of Discrete-Time Recurrent Neural Networks. The standard and advanced criteria for absolute stability of these essentially nonlinear systems produce rather weak results; the method mentioned above proves to be more powerful. It involves a multi-step procedure that maximizes special nonconvex functions over polytopes at every step. We derive conditions which guarantee the existence of at most one point of local maximum for such functions over every hyperplane. This nontrivial result is valid for a wide range of neuron transfer functions.
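To give a flavor of the domain-reduction idea, here is a minimal sketch under simplifying assumptions: the network is x_{k+1} = W·tanh(x_k) with a hypothetical weight matrix W, and the dissipativity domain is an axis-aligned box rather than a general polytope. Because tanh is monotone, the exact image bounds of a box under each output coordinate can be computed by splitting W into its positive and negative parts (a stand-in for the nonconvex maximization over polytopes described in the abstract). Repeatedly replacing the box by (an enclosure of) its image shrinks the domain toward the equilibrium when the dynamics are stable.

```python
import numpy as np

# Hypothetical example weights; row sums of |W| are below 1, so the
# origin is globally attracting and the boxes should contract.
W = np.array([[0.3, -0.2],
              [0.1,  0.4]])

def image_box(lo, hi, W):
    """Tight axis-aligned bounding box of {W @ tanh(x) : lo <= x <= hi}.

    tanh is applied componentwise and is monotone increasing, so each
    tanh(x_j) ranges exactly over [tanh(lo_j), tanh(hi_j)].  Splitting W
    into nonnegative and nonpositive parts then gives exact per-coordinate
    extrema of the linear combination over that interval box.
    """
    t_lo, t_hi = np.tanh(lo), np.tanh(hi)
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    new_lo = Wp @ t_lo + Wn @ t_hi   # minimize each output coordinate
    new_hi = Wp @ t_hi + Wn @ t_lo   # maximize each output coordinate
    return new_lo, new_hi

# Start from a large box and reduce the dissipativity domain step by step.
lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
for step in range(10):
    lo, hi = image_box(lo, hi, W)
    print(step, lo.round(4), hi.round(4))
```

This box-based iteration is only an illustration: the method in the paper works with general polytopes, where computing the image requires maximizing nonconvex functions over each face, which is exactly where the uniqueness-of-local-maximum result becomes important.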