Critical questions in dynamical neuroscience and machine learning concern the stability, robustness, and computational efficiency of continuous-time neural networks. These properties can be established simultaneously via contraction analysis. This paper develops a comprehensive non-Euclidean contraction theory for continuous-time neural networks. First, for non-Euclidean $\ell_{1}/\ell_{\infty}$ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights and closed-form worst-case expressions over certain matrix polytopes. Second, for locally Lipschitz maps (e.g., those arising as activation functions), we show that their one-sided Lipschitz constant equals the essential supremum of the logarithmic norm of their Jacobian.
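To make the weighted logarithmic-norm machinery concrete, the sketch below evaluates the diagonally weighted $\ell_{\infty}$ logarithmic norm $\mu_{\infty,[\eta]^{-1}}(A)=\max_i\bigl(a_{ii}+\sum_{j\neq i}|a_{ij}|\,\eta_j/\eta_i\bigr)$ via the standard similarity-transform formula and shows how a suitable positive diagonal weight can certify contraction when the unweighted norm does not. This is an illustrative numerical example, not code from the paper; the function name, test matrix, and chosen weights are assumptions made here for the sketch.

```python
# A minimal numerical sketch (not from the paper): the diagonally weighted
# l_infinity logarithmic norm mu_{inf,[eta]^{-1}}(A) = mu_inf([eta]^{-1} A [eta]),
# computed via the standard closed form
#   mu_inf(B) = max_i ( b_ii + sum_{j != i} |b_ij| ).
# Function names, the test matrix, and the weights below are illustrative.
import numpy as np

def log_norm_inf_weighted(A: np.ndarray, eta: np.ndarray) -> float:
    """Weighted l_infinity log norm of A with positive diagonal weight eta."""
    B = np.diag(1.0 / eta) @ A @ np.diag(eta)           # similarity transform [eta]^{-1} A [eta]
    off_diag = np.abs(B) - np.diag(np.abs(np.diag(B)))  # |b_ij| with the diagonal zeroed out
    return float(np.max(np.diag(B) + off_diag.sum(axis=1)))

# The unweighted log norm is positive, so it certifies nothing ...
A = np.array([[-2.0, 3.0],
              [0.5, -2.0]])
print(log_norm_inf_weighted(A, np.array([1.0, 1.0])))   # max(-2+3, -2+0.5) = 1.0

# ... but the weight eta = (2, 1) makes the weighted log norm negative,
# certifying exponential contraction of x' = A x + u in that weighted norm.
print(log_norm_inf_weighted(A, np.array([2.0, 1.0])))   # max(-2+1.5, -2+1) = -0.5
```

Because $\eta \mapsto \mu_{\infty,[\eta]^{-1}}(A)$ is quasiconvex in the positive diagonal weights (one of the paper's results), searching for a weight that drives this quantity negative is a well-behaved optimization problem.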