This paper is concerned with the training of neural networks (NNs) under semidefinite constraints, which allows for NN training with robustness and stability guarantees. In particular, we focus on Lipschitz bounds for NNs. Exploiting the banded structure of the underlying matrix constraint, we set up an efficient and scalable interior-point-based training scheme for NN training problems of this kind. Our implementation allows us to enforce Lipschitz constraints in the training of large-scale deep NNs such as Wasserstein generative adversarial networks (WGANs) via semidefinite constraints. In numerical examples, we show the superiority of our method and its applicability to WGAN training.
Comment: to be published in 61st IEEE Conferen...
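As background for the Lipschitz bounds discussed above, the following is a minimal sketch (not the paper's SDP-based method) of the classical, looser upper bound on the Lipschitz constant of a feedforward network: the product of the spectral norms of its weight matrices. The function name and the random weights are illustrative assumptions.

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Upper-bound the Lipschitz constant of a feedforward network with
    1-Lipschitz activations (e.g. ReLU) by the product of the spectral
    norms of its weight matrices. This naive bound is generally looser
    than bounds obtained from semidefinite programming."""
    bound = 1.0
    for W in weights:
        # Spectral norm = largest singular value of W
        bound *= np.linalg.norm(W, ord=2)
    return bound

# Illustrative usage with two random layers
rng = np.random.default_rng(0)
W1 = rng.standard_normal((64, 32))
W2 = rng.standard_normal((10, 64))
print(lipschitz_upper_bound([W1, W2]))
```

The SDP-based approaches surveyed in the abstracts below tighten this kind of bound by accounting for how the activations couple consecutive layers, at the cost of solving a semidefinite program.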
Exciting new work on the generalization bounds for neural networks (NN) given by Neyshabur et al. ,...
We introduce an efficient and tight layer-based semidefinite relaxation for verifying local robust-n...
The Lipschitz constant is an important quantity that arises in analysing the convergence of gradient...
We propose an enhanced semidefinite program (SDP) relaxation to enable the tight and efficient verif...
Despite being impactful on a variety of problems and applications, the generative adversarial nets (...
We propose an enhanced semidefinite program (SDP) relaxation to enable the tight and efficient verif...
We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully ...
Since their invention, generative adversarial networks (GANs) have become a popular approach for lea...
We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully ...
Stability certification and identifying a safe and stabilizing initial set are two important concern...
We introduce a novel method based on semidefinite program (SDP) for the tight and efficient verifica...
In this paper, we address the adversarial training of neural ODEs from a robust control perspective....
Many future technologies rely on neural networks, but verifying the correctness of their behavior re...
18 pages, 10 figures, 2 tables. International audience. The Lipschitz constant of a network plays an imp...
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with r...