We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bounds on the Lipschitz constant of neural networks. The underlying optimization problems boil down to either linear (LP) or semidefinite (SDP) programming. We show how to exploit the sparse connectivity of a network to significantly reduce the complexity of computation. This is especially useful for convolutional as well as pruned neural networks. We conduct experiments on networks with random weights as well as networks trained on MNIST, showing that in the particular case of the ℓ∞-Lipschitz constant, our approach yields superior estimates compared to baselines available in the literature.
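As context for the kind of bound such LP/SDP hierarchies tighten, the sketch below computes the naive product-of-operator-norms upper bound on the ℓ∞-Lipschitz constant of a feed-forward ReLU network. This is a standard baseline, not the LiPopt method itself; the layer sizes, random weights, and function name are illustrative assumptions.

```python
import numpy as np

def naive_linf_lipschitz_bound(weights):
    """Product-of-operator-norms upper bound on the l_inf Lipschitz constant
    of a feed-forward ReLU network (hypothetical helper, not LiPopt itself).

    ReLU is 1-Lipschitz coordinate-wise, so multiplying the per-layer induced
    inf-norms (max absolute row sum) gives a valid, though typically loose,
    upper bound -- the baseline that LP/SDP hierarchies aim to tighten.
    """
    bound = 1.0
    for W in weights:
        bound *= np.abs(W).sum(axis=1).max()  # ||W||_{inf -> inf}
    return bound

# Toy 2-layer network with random weights (assumed sizes for illustration).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 8)),  # layer 1: R^8 -> R^16
           rng.standard_normal((1, 16))]  # layer 2: R^16 -> R
print(naive_linf_lipschitz_bound(weights))
```

Because this bound ignores how the layers interact, it grows multiplicatively with depth; polynomial-optimization approaches recover tightness by reasoning about all layers jointly.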
State-of-the-art approaches for training Differentially Private (DP) Deep Neural Networks (DNN) face...
It has been known for some years that the uniform-density problem for forward neural networ...
The stability of neural networks with respect to adversarial perturbations has...
The Lipschitz constant of a network plays an imp...
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with r...
The Lipschitz constant is an important quantity that arises in analysing the convergence of gradient...
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschi...
The motivation for this work is to improve the performance of deep neural networks through the optim...
Besides the minimization of the prediction error, two of the most desirable properties of a regression...
Fast and precise Lipschitz constant estimation of neural networks is an important task for deep lear...
We introduce a variational framework to learn the activation functions of deep neural networks. Our ...
Obtaining sharp Lipschitz constants for feed-forward neural networks is essent...
Private inference on neural networks requires running all the computation on encrypted data. Unfortu...
This paper introduces the first statistically consistent estimator of the optimal transport map betw...
In this work, we propose a dissipativity-based method for Lipschitz constant estimation of 1D convol...