Private inference on neural networks requires running all the computation on encrypted data. Unfortunately, neural networks contain a large number of non-arithmetic operations, such as ReLU activation functions and max pooling layers, which incur a high latency cost in their encrypted form. To address this issue, the majority of private inference methods replace some or all of the non-arithmetic operations with a polynomial approximation. This step introduces approximation errors that can substantially alter the output of the neural network and decrease its predictive performance. In this paper, we propose a Lipschitz-Guided Abstraction Refinement method (LiGAR), which provides strong guarantees on the global approximation error. Our method...
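To make the error-propagation idea concrete, here is a minimal numeric sketch; it is an illustration, not the paper's LiGAR algorithm. It fits a degree-2 polynomial to ReLU on [-1, 1], measures the worst-case pointwise approximation error, and propagates that error through a toy feed-forward network using the standard Lipschitz composition bound. The per-layer Lipschitz constants and the 3-layer network are assumptions made up for illustration.

```python
import numpy as np

# Degree-2 least-squares polynomial approximation of ReLU on [-1, 1]
# (a common stand-in for non-arithmetic ops in private inference).
xs = np.linspace(-1.0, 1.0, 2001)
coeffs = np.polyfit(xs, np.maximum(xs, 0.0), deg=2)
poly_relu = np.poly1d(coeffs)

# Worst-case pointwise error of the polynomial activation on [-1, 1].
eps = float(np.max(np.abs(poly_relu(xs) - np.maximum(xs, 0.0))))

# Hypothetical per-layer Lipschitz constants (e.g. spectral norms of
# the weight matrices) for an assumed 3-layer toy network. An error
# eps injected at layer i is amplified by at most the product of the
# Lipschitz constants of all later layers, giving the global bound
#   sum_i eps_i * prod_{j > i} L_j
lipschitz = [2.0, 1.5, 3.0]  # assumed values, for illustration only
bound = 0.0
for i in range(len(lipschitz)):
    amplification = float(np.prod(lipschitz[i + 1:]))  # empty product = 1
    bound += eps * amplification

print(f"per-activation error eps = {eps:.4f}")
print(f"global error bound       = {bound:.4f}")
```

This composition bound assumes every polynomial activation layer incurs error at most eps and that each layer is Lipschitz with the listed constant; it is the standard worst-case certificate, which methods like LiGAR then tighten by refining the approximation where the bound is loose.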
The Lipschitz constant is an important quantity that arises in analysing the convergence of gradient...
Advancements in machine learning (ML) algorithms, data acquisition platforms, and high-end computer ...
We contribute to a better understanding of the class of functions that can be represented by a neura...
Private computation of nonlinear functions, such as Rectified Linear Units (ReLUs) and max-pooling o...
State-of-the-art approaches for training Differentially Private (DP) Deep Neural Networks (DNN) face...
We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bo...
Private inference (PI) enables inference directly on cryptographically secure data. While promising t...
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with r...
The desire to provide robust guarantees on neural networks has never been more important, as their p...
Outsourced inference service has enormously promoted the popularity of deep learning, and helped use...
In recent years, deep learning has become an increasingly popular approach to modelling data, due to...
The Lipschitz constant of a network plays an imp...
The processing of sensitive user data using deep learning models is an area that has gained recent t...
Neural networks (NN), one type of machine learning (ML) algorithms, have emerged as a powerful parad...
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschi...