Deep neural networks often lack the safety and robustness guarantees needed for deployment in safety-critical systems. Formal verification techniques can prove input-output safety properties of networks, but when properties are difficult to specify, we instead rely on solving various optimization problems. In this work, we present an algorithm called ZoPE that solves optimization problems over the output of feedforward ReLU networks with low-dimensional inputs. The algorithm eagerly splits the input space, bounding the objective using zonotope propagation at each step, and improves computational efficiency compared to existing mixed-integer programming approaches. We demonstrate how to formulate and solve three types of optimization problems...
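The core loop described above — propagate a zonotope through the network to bound the objective over an input cell, then eagerly split the cell with the best bound — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the DeepZ-style ReLU relaxation, the widest-dimension splitting rule, and all function names here are assumptions made for the example.

```python
import heapq
import numpy as np

def box_to_zonotope(lo, hi):
    """Represent the input box [lo, hi] as a zonotope (center, generators)."""
    return (lo + hi) / 2.0, np.diag((hi - lo) / 2.0)

def affine(c, G, W, b):
    """Affine layers map zonotopes to zonotopes exactly."""
    return W @ c + b, W @ G

def relu(c, G):
    """Sound over-approximation of ReLU on a zonotope (DeepZ-style relaxation)."""
    r = np.abs(G).sum(axis=1)
    lo, hi = c - r, c + r
    lam = np.ones(c.size)              # per-neuron slope
    mu = np.zeros(c.size)              # per-neuron offset / new generator size
    lam[hi <= 0] = 0.0                 # provably inactive neurons output 0
    crossing = (lo < 0) & (hi > 0)
    lam[crossing] = hi[crossing] / (hi[crossing] - lo[crossing])
    mu[crossing] = -lam[crossing] * lo[crossing] / 2.0
    return lam * c + mu, np.hstack([lam[:, None] * G, np.diag(mu)])

def output_bounds(lo, hi, layers):
    """Bound the network's outputs over the input box via zonotope propagation."""
    c, G = box_to_zonotope(lo, hi)
    for i, (W, b) in enumerate(layers):
        c, G = affine(c, G, W, b)
        if i < len(layers) - 1:        # ReLU between hidden layers only
            c, G = relu(c, G)
    r = np.abs(G).sum(axis=1)
    return c - r, c + r

def maximize(layers, lo, hi, out_idx=0, tol=1e-2, max_iter=2000):
    """Branch and bound: eagerly split input cells, bounding with zonotopes."""
    best_lower = -np.inf
    ub0 = output_bounds(lo, hi, layers)[1][out_idx]
    heap = [(-ub0, lo.tolist(), hi.tolist())]  # max-heap on the upper bound
    for _ in range(max_iter):
        neg_ub, lo_l, hi_l = heapq.heappop(heap)
        ub, lo_c, hi_c = -neg_ub, np.array(lo_l), np.array(hi_l)
        x = (lo_c + hi_c) / 2.0        # lower bound: evaluate at the cell center
        for i, (W, b) in enumerate(layers):
            x = W @ x + b
            if i < len(layers) - 1:
                x = np.maximum(x, 0.0)
        best_lower = max(best_lower, x[out_idx])
        if ub - best_lower <= tol:     # popped cell carries the global upper bound
            return best_lower, ub
        d = int(np.argmax(hi_c - lo_c))  # split the widest input dimension
        mid = (lo_c[d] + hi_c[d]) / 2.0
        for a, b_ in ((lo_c[d], mid), (mid, hi_c[d])):
            lo_n, hi_n = lo_c.copy(), hi_c.copy()
            lo_n[d], hi_n[d] = a, b_
            ub_n = output_bounds(lo_n, hi_n, layers)[1][out_idx]
            heapq.heappush(heap, (-ub_n, lo_n.tolist(), hi_n.tolist()))
    return best_lower, -heap[0][0]
```

Because affine maps are exact on zonotopes and the ReLU relaxation is sound, each cell's bound is valid, so the returned interval always contains the true maximum; the low-dimensional input assumption is what keeps the number of splits tractable.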