In this work, we propose a dissipativity-based method for Lipschitz constant estimation of 1D convolutional neural networks (CNNs). In particular, we analyze the dissipativity properties of convolutional, pooling, and fully connected layers, making use of incremental quadratic constraints for nonlinear activation functions and pooling operations. The Lipschitz constant of the concatenation of these mappings is then estimated by solving a semidefinite program which we derive from dissipativity theory. To make our method as efficient as possible, we take the structure of convolutional layers into account, realizing these finite impulse response filters as causal dynamical systems in state space and carrying out the dissipativity analysis for th...
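As a rough illustration of the general idea (not the formulation of this work, which additionally exploits the state-space structure of convolutional layers), the following minimal sketch bounds the Lipschitz constant of a one-hidden-layer ReLU network by a semidefinite program in which a nonnegative diagonal multiplier T encodes the incremental quadratic constraint satisfied by the activation, in the spirit of LipSDP-type conditions. The weights W0 and W1, the solver choice, and the specific LMI shown are illustrative assumptions, not the paper's exact program.

```python
# Hedged sketch: SDP-based Lipschitz upper bound for a one-hidden-layer ReLU
# network, f(x) = W1 * relu(W0 x + b0) + b1, with ReLU slope-restricted on [0, 1].
# This is a generic LipSDP-style condition, not the method proposed in the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 3
W0 = rng.standard_normal((n_hid, n_in))   # hypothetical example weights
W1 = rng.standard_normal((n_out, n_hid))

# Decision variables: rho = L^2 and a diagonal multiplier T >= 0 that certifies
# the incremental quadratic constraint satisfied by the ReLU nonlinearity.
rho = cp.Variable(nonneg=True)
t = cp.Variable(n_hid, nonneg=True)
T = cp.diag(t)

# LMI (slope bounds alpha = 0, beta = 1 for ReLU):
# [[ -rho*I,   W0^T T          ],
#  [  T W0,   -2T + W1^T W1 ]]  <= 0   (negative semidefinite)
M = cp.bmat([
    [-rho * np.eye(n_in), W0.T @ T],
    [T @ W0,              -2 * T + W1.T @ W1],
])
M_sym = (M + M.T) / 2  # symmetrize so the solver accepts the PSD constraint

prob = cp.Problem(cp.Minimize(rho), [M_sym << 0])
prob.solve(solver=cp.SCS)

print("certified Lipschitz bound:", np.sqrt(rho.value))
# Sanity check: the trivial product-of-spectral-norms bound is never smaller.
print("product of spectral norms:", np.linalg.norm(W0, 2) * np.linalg.norm(W1, 2))
```

The SDP bound is guaranteed to be no worse than the trivial product of the layers' spectral norms, which is recovered for a particular choice of T; the gain of dissipativity-based formulations comes from optimizing over the multiplier.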
The robustness of neural networks can be quantitatively indicated by a lower bound within which any ...
Nonlinearity causes information loss. The phase retrieval problem, or the phaseless reconstruction p...
The Lipschitz constant is an important quantity that arises in analysing the convergence of gradient...
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschi...
We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bo...
Obtaining sharp Lipschitz constants for feed-forward neural networks is essent...
We investigate robustness of deep feed-forward neural networks when input data are subject to random...
We introduce a variational framework to learn the activation functions of deep neural networks. Our ...
Fast and precise Lipschitz constant estimation of neural networks is an important task for deep lear...
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with r...
Building nonexpansive Convolutional Neural Networks (CNNs) is a challenging pr...
The stability of neural networks with respect to adversarial perturbations has...
We investigate the robustness of feed-forward neural networks when input data ...
In recent times, we have seen a surge in usage of Convolutional Neural Networks to solve all kinds o...