The paper introduces a new approach to analyzing the stability of neural network models without using any Lyapunov function. With this approach, we investigate the stability properties of the general gradient-based neural network model for optimization problems. Our discussion covers both isolated equilibrium points and connected equilibrium sets, which may be unbounded. For a general optimization problem, if the objective function is bounded below and its gradient is Lipschitz continuous, we prove that (a) any trajectory of the gradient-based neural network converges to an equilibrium point, and (b) Lyapunov stability is equivalent to asymptotic stability in the gradient-based neural networks. For a convex optimization proble...
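As an illustrative sketch only (the paper's exact model is not reproduced here), the general gradient-based neural network for minimizing an objective f is commonly taken to be the gradient flow dx/dt = -∇f(x(t)); claim (a) above then says every trajectory of this flow converges to an equilibrium point, i.e. a point where the gradient vanishes. The quadratic objective, the matrix Q, and the forward-Euler integration in the snippet below are assumptions chosen for illustration, not taken from the paper.

```python
import numpy as np

# Illustrative sketch: integrate the gradient flow dx/dt = -grad f(x)
# with forward Euler for a simple convex quadratic
#   f(x) = 0.5 * x^T Q x - b^T x,
# whose unique equilibrium solves Q x = b.

Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])      # symmetric positive definite (assumed example)
b = np.array([1.0, 1.0])

def grad_f(x):
    return Q @ x - b            # gradient of the quadratic objective

x = np.array([5.0, -4.0])       # arbitrary initial state
dt = 0.01                       # step size, small enough for stable integration
for _ in range(5000):
    x = x - dt * grad_f(x)      # Euler step of dx/dt = -grad f(x)

print("trajectory endpoint:", x)
print("equilibrium Q^{-1} b:", np.linalg.solve(Q, b))
```

For this convex example the trajectory endpoint matches the equilibrium to numerical precision, which is the behavior the convergence result describes in a much more general setting.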
This paper is concerned with neural networks which have the ability to solve linear and nonlinear co...
This paper deals with a class of large-scale nonlinear dynamical systems, namely the additive neural...
This paper is devoted to studying both the global and local stability of dynamical neural networks. ...
UNM Technical Report No. EECE93 001. This report presents a formalism that enables the dynamics of a b...
The present paper shows that a sufficient condition for the existence of a stable solution to an autor...
In this paper, we present new conditions ensuring existence, uniqueness, and Global Asymptotic Stabi...
This brief studies the complete stability of neural networks with nonmonotonic piecewise linear acti...
This paper investigates the existence, uniqueness, and global exponential stability (GES) of the equ...
This paper presents a neural network approach for solving convex programming problems with e...
This report presents a formalism that enables the dynamics of a broad class of neural networks to be...
In this paper, by using the concept of differential equations with piecewise constant argume...
This paper discusses the stabilizability of artificial neural networks trained by utilizing the gradi...
This paper is divided into four parts. Part 1 contains a survey of three neural networks found in th...
This paper considers a class of neural networks (NNs) for solving linear programming (LP) problems, ...
This paper reviews a formalism that enables the dynamics of a broad class of neural networks to be u...