Deep Learning (read: neural networks) has emerged as one of the most exciting and powerful tools in the statistical machine learning toolbox. Over the past decade, neural networks have surpassed incumbent state-of-the-art models in computer vision, natural language processing, and reinforcement learning, among a myriad of other fields. Though these achievements are impressive, neural networks remain surprisingly brittle, especially in the case of smaller network architectures. They can easily overfit, yielding poor performance on unseen data, and are remarkably susceptible to small perturbations and noise on their inputs. White noise on images has been shown to consistently alter the predictions of neural networks trained for classification...
It took until the last decade to finally see a machine match human performance on essentially any ta...
Today the number of applications that use Neural Networks is increasing every day. The scope of us...
This work explores the impact of various design and training choices on the resilience of a neural n...
Deep learning has seen tremendous growth, largely fueled by more powerful computers, the availabilit...
Neural Networks are prone to lower accuracy when classifying images with noise pertu...
Deep learning plays an important role in various disciplines, such as auto-driving, information tech...
This paper presents a novel framework for image classification which comprises a convolutional neura...
In this thesis, we study the robustness and generalization properties of Deep Neural Networks (DNNs)...
The reliability of deep learning algorithms is fundamentally challenged by the existence of adversar...
Noise Injection consists of adding noise to the inputs during neural network training. Experimental ...
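A minimal sketch of what such input noise injection might look like in PyTorch is given below; the toy model, noise level, and dummy data are illustrative assumptions rather than the setup of the cited work.

```python
# Minimal sketch of input noise injection: Gaussian noise is added to each
# training batch before the forward pass. Model, data, and hyperparameters
# here are placeholders, not taken from the cited work.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
noise_std = 0.1  # assumed noise level; typically tuned on a validation set

# Dummy batch standing in for real image data (e.g. MNIST-sized inputs).
images = torch.rand(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

model.train()
noisy_images = images + noise_std * torch.randn_like(images)  # noise injected only at training time
logits = model(noisy_images)
loss = loss_fn(logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```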
The understanding of generalization in machine learning is in a state of flux. This is partly due to...
Gaussian noise injections (GNIs) are a family of simple and widely-used regula...
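As a companion sketch, the following PyTorch snippet shows one common GNI variant, injecting zero-mean Gaussian noise into hidden activations during training only; the layer sizes and noise scale are assumptions made for illustration, not values from the cited work.

```python
# Minimal sketch of a Gaussian noise injection (GNI) applied to hidden
# activations rather than the raw inputs.
import torch
import torch.nn as nn

class GaussianNoise(nn.Module):
    """Adds zero-mean Gaussian noise to its input during training only."""
    def __init__(self, std: float = 0.1):
        super().__init__()
        self.std = std

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training and self.std > 0:
            return x + self.std * torch.randn_like(x)
        return x  # identity at evaluation time

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256),
    GaussianNoise(std=0.1),  # noise injected after the first hidden layer
    nn.ReLU(),
    nn.Linear(256, 10),
)

x = torch.rand(8, 1, 28, 28)
model.train()
print(model(x).shape)  # noisy forward pass: torch.Size([8, 10])
model.eval()
print(model(x).shape)  # deterministic forward pass at test time
```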
Despite superior accuracy on most vision recognition tasks, deep neural networks are susceptible to ...
State-of-the-art deep networks for image classification are vulnerable to adversarial examples—miscl...
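The fast gradient sign method (FGSM) is one standard way such adversarial examples are constructed; the sketch below is an illustration with a toy linear model and an assumed perturbation budget, not the networks or attack evaluated in the work cited above.

```python
# Minimal sketch of the fast gradient sign method (FGSM) for crafting an
# adversarial example; model size and epsilon are illustrative assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
loss_fn = nn.CrossEntropyLoss()

image = torch.rand(1, 1, 28, 28, requires_grad=True)
label = torch.tensor([3])
epsilon = 0.05  # perturbation budget in input space

# Compute the gradient of the loss with respect to the input pixels.
loss = loss_fn(model(image), label)
loss.backward()

# Step in the direction that maximally increases the loss, then clip to the valid pixel range.
adversarial = (image + epsilon * image.grad.sign()).clamp(0.0, 1.0).detach()

print((adversarial - image.detach()).abs().max())  # perturbation is at most epsilon
```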
Understanding the impact of data structure on the computational tractability of learning is a key ch...