The stability of neural networks with respect to adversarial perturbations has been extensively studied. One of the main strategies consists of quantifying the Lipschitz regularity of neural networks. In this paper, we introduce a multivariate Lipschitz constant-based stability analysis of fully connected neural networks that allows us to capture the influence of each input or group of inputs on the network's stability. Our approach relies on a suitable re-normalization of the input space, with the objective of performing a more precise analysis than the one provided by a global Lipschitz constant. We investigate the mathematical properties of the proposed multivariate Lipschitz analysis and show its usefulness in better...
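To make the contrast with a global analysis concrete, here is a minimal sketch, assuming a fully connected network with 1-Lipschitz activations (e.g. ReLU). The product-of-spectral-norms quantity is the standard global upper bound, not the tighter multivariate estimate developed in the paper; the `groupwise_lipschitz_bounds` helper and the input groups are hypothetical illustrations of how re-weighting (here, restricting) input coordinates isolates the sensitivity of the network to each group of inputs.

```python
import numpy as np

def global_lipschitz_upper_bound(weights):
    """Standard upper bound on the Lipschitz constant of a fully connected
    network with 1-Lipschitz activations: the product of the spectral norms
    of the weight matrices."""
    return np.prod([np.linalg.norm(W, ord=2) for W in weights])

def groupwise_lipschitz_bounds(weights, groups):
    """Hypothetical illustration of the input re-normalization idea: for each
    group of input coordinates, keep only the corresponding columns of the
    first weight matrix and bound the sensitivity to that group alone."""
    W1 = weights[0]
    rest = np.prod([np.linalg.norm(W, ord=2) for W in weights[1:]])
    return {name: np.linalg.norm(W1[:, idx], ord=2) * rest
            for name, idx in groups.items()}

# Toy 3-layer network with random weights (input dimension 4).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 4)),
           rng.standard_normal((6, 8)),
           rng.standard_normal((1, 6))]

print(global_lipschitz_upper_bound(weights))
print(groupwise_lipschitz_bounds(weights, {"x1_x2": [0, 1], "x3_x4": [2, 3]}))
```

The group-wise bounds are never larger than the global one and expose which inputs the network is most sensitive to, which is the kind of per-input information the multivariate analysis is designed to provide.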
We present a domain-theoretic framework for validated robustness analysis of neural networks. We fir...
We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bo...
Fast and precise Lipschitz constant estimation of neural networks is an important task for deep lear...
We investigate the robustness of feed-forward neural networks when input data ...
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with r...
This paper presents a quantitative approach to demonstrate the robustness of n...
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschi...
Obtaining sharp Lipschitz constants for feed-forward neural networks is essent...
We investigate robustness of deep feed-forward neural networks when input data are subject to random...
Stability of a machine learning model is the extent to which a model can conti...
A domain-theoretic framework is presented for validated robustness analysis of neural networks. Firs...
Neural networks (NNs) are now routinely implemented on systems that must operate in uncertain e...
Neural networks are currently used in a large number of applications, which is why the following ques...
Recent work has shown that state-of-the-art classifiers are quite brittle, in the sense that a small...