18 pages, 10 figures, 2 tables. International audience. The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multilayer deep neural network. The novelty is to combine a polynomial lifting for the derivatives of ReLU functions with a weak generalization of Putinar's positivity certificate. This idea could also apply to other, nearly sparse, polynomial optimization problems in machine learning. We empirically demonstrate that our method provides a trade-off with respect to the state-of-the-art linear programming approach, a...
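For context, the simplest upper bound on the global Lipschitz constant of a ReLU network (with respect to the 2-norm) is the product of the spectral norms of the weight matrices. This is the classical naive bound that hierarchies such as the SDP approach above are designed to tighten; the sketch below implements only this baseline, not the paper's method. The function name and the random example weights are illustrative assumptions.

```python
import numpy as np

def naive_lipschitz_upper_bound(weights):
    """Naive upper bound on the global 2-norm Lipschitz constant of a
    feed-forward ReLU network: the product of layer spectral norms.
    Valid because ReLU is 1-Lipschitz, but typically very loose."""
    bound = 1.0
    for W in weights:
        # ord=2 on a matrix gives its largest singular value (spectral norm)
        bound *= np.linalg.norm(W, ord=2)
    return bound

# Illustrative two-layer network with random weights
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
print(naive_lipschitz_upper_bound(Ws))
```

SDP and LP hierarchies improve on this bound by accounting for how the activation patterns of different layers interact, at the cost of solving larger optimization problems.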
International audienceGiven a training set, a loss function, and a neural network architecture, it i...
This work addresses the sequential optimization of an unknown and potentially nonconvex function ove...
We contribute to a better understanding of the class of functions that is represented by a neural ne...
We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bo...
We introduce a variational framework to learn the activation functions of deep neural networks. Our ...
The Lipschitz constant is an important quantity that arises in analysing the convergence of gradient...
The motivation for this work is to improve the performance of deep neural networks through the optim...
16 pages, 4 tables, 2 figures. International audience. Deep equilibrium models are based on implicitly d...
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschi...
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with r...
International audience. Obtaining sharp Lipschitz constants for feed-forward neural networks is essent...
39 pages, 15 tables. We consider polynomial optimization problems (POP) on a semialgebraic set contain...
Fast and precise Lipschitz constant estimation of neural networks is an important task for deep lear...
This paper investigates the existence, uniqueness, and global exponential stability (GES) of the equ...
International audience. Given a training set, a loss function, and a neural network architecture, it i...