The Jacobian matrix (or the gradient for single-output networks) is directly related to many important properties of neural networks, such as the function landscape, stationary points, (local) Lipschitz constants and robustness to adversarial attacks. In this paper, we propose a recursive algorithm, RecurJac, to compute both upper and lower bounds for each element in the Jacobian matrix of a neural network with respect to the network's input; the network may contain a wide range of activation functions. As a byproduct, we can efficiently obtain a (local) Lipschitz constant, which plays a crucial role in neural network robustness verification, as well as the training stability of GANs. Experiments show that (local) Lipschitz constants produc...
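At a single input point, the quantities the abstract above discusses are concrete: for a ReLU network the Jacobian is a product of weight matrices and activation-pattern masks, and its largest singular value gives the Lipschitz constant of the linear region containing that point. A minimal sketch on a hypothetical toy two-layer network (not the RecurJac algorithm itself, which bounds these quantities over a whole input region):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network f(x) = W2 @ relu(W1 @ x), for illustration only.
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))

def jacobian(x):
    # Within one linear region of a ReLU net, the Jacobian is W2 @ D @ W1,
    # where D is diagonal with 1 exactly where the pre-activation is positive.
    pre = W1 @ x
    D = np.diag((pre > 0).astype(float))
    return W2 @ D @ W1

x = rng.standard_normal(3)
J = jacobian(x)
# The largest singular value of J is the (l2) Lipschitz constant of the
# piecewise-linear region containing x -- a pointwise lower bound on the
# local Lipschitz constant over any neighborhood of x.
local_lip = np.linalg.svd(J, compute_uv=False)[0]
```

RecurJac's contribution is bounding every entry of this Jacobian over a perturbation region, where the activation pattern `D` is no longer fixed; the sketch above only evaluates the exact Jacobian at one point.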
Obtaining sharp Lipschitz constants for feed-forward neural networks is essent...
We pursue a line of research that seeks to regularize the spectral norm of the Jacobian of the input...
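The spectral norm of the input-output Jacobian that this line of work regularizes is typically estimated by power iteration rather than a full SVD. A minimal sketch, using a hypothetical random matrix in place of a network's Jacobian:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stand-in for a network's input-output Jacobian at one input.
J = rng.standard_normal((5, 3))

def spectral_norm(J, iters=500):
    # Power iteration on J^T J converges to the top right-singular vector;
    # ||J v|| then estimates the largest singular value, which is the
    # quantity a Jacobian spectral-norm regularizer penalizes.
    v = rng.standard_normal(J.shape[1])
    for _ in range(iters):
        v = J.T @ (J @ v)
        v /= np.linalg.norm(v)
    return np.linalg.norm(J @ v)

est = spectral_norm(J)
```

In training, the same iteration is run with Jacobian-vector products from automatic differentiation instead of an explicit matrix, and only a few iterations per step are needed because the singular vector changes slowly.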
The authors address the problem of choosing synaptic weights in a recursive (Hopfield) neural networ...
Finding minimum distortion of adversarial example...
The stability of neural networks with respect to adversarial perturbations has...
We can compare the expressiveness of neural networks that use rectified linear units (ReLUs) by the ...
We introduce a variational framework to learn the activation functions of deep neural networks. Our ...
Learning expressive probabilistic models correctly describing the data is a ub...
The desire to provide robust guarantees on neural networks has never been more important, as their p...
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschi...
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with r...
Starting from an analytical view of the multilayer network architecture, we deduce a polynomial-time alg...
We present a domain-theoretic framework for validated robustness analysis of neural networks. We fir...
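Validated robustness analysis of the kind described above rests on propagating sound enclosures of the input set through the network; the simplest instance is interval bound propagation. A minimal sketch under that assumption (a generic interval method, not the paper's domain-theoretic framework):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical one-layer network relu(W x + b), for illustration only.
W, b = rng.standard_normal((4, 3)), rng.standard_normal(4)

def ibp_linear(W, b, lo, hi):
    # Propagate the box [lo, hi] through x -> W x + b exactly:
    # the image of a box under an affine map has center W c + b and
    # per-coordinate radius |W| r.
    center, radius = (lo + hi) / 2, (hi - lo) / 2
    c = W @ center + b
    r = np.abs(W) @ radius
    return c - r, c + r

def ibp_relu(lo, hi):
    # ReLU is monotone, so applying it to the endpoints is sound.
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

x0 = rng.standard_normal(3)
lo, hi = ibp_relu(*ibp_linear(W, b, x0 - 0.1, x0 + 0.1))
```

Every true output for an input in the box is guaranteed to lie in `[lo, hi]`; the bounds are conservative, and tightening them (as richer abstract domains do) is what makes verified robustness certificates less pessimistic.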
A recent line of work has analyzed the theoretical properties of deep neural networks via the Neural...