We show how to construct a Lyapunov function for a discrete recurrent neural network using the variable-gradient method; the same method can also be used to obtain the Hopfield energy function. Using our Lyapunov function, we compute an upper bound on the transient length of the neural network dynamics. We also show how our Lyapunov function provides insight into how introducing self-feedback weights into the neural network affects the sizes of the basins of attraction of the equilibrium points in the network's state space.
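To make the connection to the Hopfield energy function concrete, the following is a minimal sketch (not the paper's variable-gradient construction): for a discrete-state network with symmetric weights and zero self-feedback, the classic energy E(x) = -½ xᵀWx - bᵀx is non-increasing under asynchronous sign-threshold updates, so it serves as a Lyapunov function. The Hebbian weight rule and the specific pattern below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hopfield_energy(W, b, x):
    """Hopfield energy E(x) = -1/2 x^T W x - b^T x.

    A Lyapunov function for asynchronous sign-threshold dynamics when
    W is symmetric with zero diagonal (no self-feedback)."""
    return -0.5 * x @ W @ x - b @ x

def async_update(W, b, x, i):
    """Asynchronous update of unit i: x_i <- sign(W[i] . x + b_i)."""
    h = W[i] @ x + b[i]
    x = x.copy()
    x[i] = 1.0 if h >= 0 else -1.0
    return x

# Store one bipolar pattern with a Hebbian outer-product rule (illustrative).
p = np.array([1.0, -1.0, 1.0, -1.0])
W = np.outer(p, p)
np.fill_diagonal(W, 0.0)   # zero self-feedback: the classic Hopfield setting
b = np.zeros(4)

x = np.array([1.0, 1.0, 1.0, -1.0])   # corrupted version of p (one bit flipped)
energies = [hopfield_energy(W, b, x)]
for _ in range(3):                     # a few sweeps over all units
    for i in range(4):
        x = async_update(W, b, x, i)
        energies.append(hopfield_energy(W, b, x))

# Energy never increases along the trajectory; the state settles on p.
assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:]))
print(energies[0], energies[-1])   # initial vs. final energy
```

Because the energy is bounded below and strictly decreases at every state change, the number of distinct states visited before equilibrium is bounded, which is the mechanism behind transient-length bounds of the kind discussed in the abstract.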