This dissertation explores compression techniques for neural networks that enable control of resource usage, accelerate training, and increase robustness against adversarial as well as out-of-distribution examples. Convolutional layers are core building blocks of neural network architectures. In general, a convolutional filter applies to the entire frequency spectrum of the input data. Our new band-limiting method artificially constrains the frequency spectra of these filters and of the data during training and inference. The frequency-domain constraints apply to both the feed-forward and back-propagation steps. Experimental results confirm that band-limited models can effectively control resource usage (GPU and memory). The band-limited method...
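The core idea of band-limiting a convolutional filter can be sketched in a few lines of NumPy: transform the filter into the frequency domain, zero out its high-frequency components, and transform back. This is a minimal illustration assuming a simple rectangular low-pass mask; the function and the `keep_fraction` parameter are illustrative names, not the dissertation's actual API or compression knob.

```python
import numpy as np

def band_limit_filter(kernel, keep_fraction=0.5):
    """Zero out the high-frequency components of a 2-D conv filter.

    keep_fraction (hypothetical parameter): fraction of the spectrum,
    per axis, that survives the low-pass mask.
    """
    H, W = kernel.shape
    spec = np.fft.fftshift(np.fft.fft2(kernel))  # low frequencies to center
    kh = max(1, int(H * keep_fraction))
    kw = max(1, int(W * keep_fraction))
    mask = np.zeros((H, W))
    top, left = (H - kh) // 2, (W - kw) // 2
    mask[top:top + kh, left:left + kw] = 1.0     # rectangular low-pass mask
    spec *= mask
    # Back to the spatial domain; the imaginary part is numerical noise
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec)))

# With keep_fraction=1.0 the mask passes everything and the filter
# is recovered; smaller fractions discard high-frequency detail.
k = np.random.default_rng(0).normal(size=(5, 5))
assert np.allclose(band_limit_filter(k, keep_fraction=1.0), k)
```

In a band-limited training loop, a constraint of this kind would be applied to the filters (and, per the abstract, to the data and gradients as well) on every step, so the network never learns or uses the discarded frequencies; fewer retained coefficients means less compute and memory.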
The entangled guardbands in terms of timing specification and energy budget ensure a system against ...
Deep Learning (read: neural networks) has emerged as one of the most exciting and powerful tools in t...
Deep neural networks consume an excessive amount of hardware resources, making them difficult to dep...
Despite the advance in deep learning technology, assuring the robustness of deep neural networks (DN...
In this thesis, we illustrate via two case studies the utility of bottom-up signal modeling and proc...
Prior works applied singular value decomposition and dropout compression methods for fully-connected...
One major limitation of CNNs is that they are vulnerable to adversarial attacks. Currently, adversar...
In this work, we propose a novel neural network architecture, called Adaptive ...
The prevalence and success of Deep Neural Network (DNN) applications in recent years have motivated ...
This work studies the sensitivity of neural networks to weight perturbations, firstly corresponding ...
LPWANs are networks characterized by the scarcity of their radio resources and...
In the last decade, deep neural networks have achieved tremendous success in many fields of machine ...
We propose Absum, which is a regularization method for improving adversarial robustness of convoluti...
The success of overparameterized deep neural networks (DNNs) poses a great challenge to deploy compu...
Convolutional Neural Networks (CNNs) are brain-inspired computational models designed to recognize p...