We propose in this work a new unsupervised training procedure that is most effective when applied after supervised training and fine-tuning of deep neural network classifiers. While standard regularization techniques combat overfitting by means unrelated to the target classification loss, such as minimizing the L2 norm or adding noise to the data, the model, or the training process, the proposed unsupervised training loss reduces overfitting by optimizing the true classifier risk. The proposed approach is evaluated on several tasks of increasing difficulty and varying conditions: unsupervised training, post-tuning and anomaly detection. It is also tested both on simple neural networks, such as small multi-layer...
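The abstract above does not reproduce the proposed loss itself; as a hedged illustration of the post-tuning setting it describes (unsupervised refinement after supervised training), the sketch below minimizes a generic unsupervised surrogate, prediction entropy, on unlabeled data. The entropy objective, the `model`, and the `unlabeled_loader` are assumptions for illustration, not the authors' risk estimator.

```python
import torch
import torch.nn.functional as F

def post_tune(model, unlabeled_loader, epochs=1, lr=1e-4):
    """Hedged sketch: refine an already-trained classifier on unlabeled
    data by minimizing prediction entropy -- a stand-in for the paper's
    unsupervised risk estimate, which the abstract does not detail."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x in unlabeled_loader:  # batches of inputs, no labels
            probs = F.softmax(model(x), dim=1)
            # Average entropy of the predictive distribution over the batch.
            entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1).mean()
            opt.zero_grad()
            entropy.backward()
            opt.step()
    return model
```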
During minibatch gradient-based optimization, the contribution of observations to the updating of the...
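This abstract is truncated, but its opening concerns how individual observations contribute to each minibatch update. As a hedged sketch of one common way to control that contribution, the helper below weights each example's loss before reducing the minibatch to a scalar; the weighting scheme is an assumption for illustration, not necessarily the paper's.

```python
import torch
import torch.nn.functional as F

def weighted_minibatch_loss(logits, targets, weights):
    """Hedged sketch: scale each observation's contribution to the
    gradient update with a per-example weight before reducing the
    minibatch loss to a scalar."""
    per_example = F.cross_entropy(logits, targets, reduction="none")
    return (weights * per_example).sum() / weights.sum()
```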
We propose a novel regularizer for supervised learning called Conditioning on Noisy Targets (CNT). ...
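The CNT description is cut off above; based only on the method's name, here is a hedged sketch of what conditioning on a noisy target could look like: a noise-corrupted one-hot label is appended to the network input during training. The Gaussian corruption with a random per-example noise level is an assumption, not a confirmed detail of the paper.

```python
import torch
import torch.nn.functional as F

def cnt_input(x, y, num_classes, sigma_max=1.0):
    """Hedged sketch of Conditioning on Noisy Targets: concatenate a
    noise-corrupted one-hot target to the (flattened) input.  The
    corruption scheme here is assumed for illustration."""
    onehot = F.one_hot(y, num_classes).float()
    # Random noise level per example (assumption).
    sigma = torch.rand(x.size(0), 1, device=x.device) * sigma_max
    noisy_target = onehot + sigma * torch.randn_like(onehot)
    return torch.cat([x.flatten(1), noisy_target], dim=1)
```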
Despite their powerful representation ability, deep neural networks (DNNs) are prone to over-fitting because...
Modern neural networks are over-parametrized. In particular, each rectified linear...
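A plausible reading of the truncated sentence is the well-known rescaling symmetry of rectified linear units, one standard sense in which such networks are over-parametrized: scaling a unit's incoming weights by c > 0 and its outgoing weights by 1/c leaves the computed function unchanged. The sketch below checks this numerically; it illustrates that symmetry, not necessarily this paper's specific argument.

```python
import torch

def relu_rescale_check(x, w_in, b_in, w_out, c=3.0):
    """Verify the ReLU rescaling symmetry: relu(c*z) = c*relu(z) for
    c > 0, so rescaling incoming weights by c and outgoing weights by
    1/c yields the same network output."""
    out = torch.relu(x @ w_in + b_in) @ w_out
    out_rescaled = torch.relu(x @ (c * w_in) + c * b_in) @ (w_out / c)
    assert torch.allclose(out, out_rescaled, atol=1e-5)
    return out

# Tiny usage example with random shapes (assumed for illustration).
x = torch.randn(4, 8)
relu_rescale_check(x, torch.randn(8, 16), torch.randn(16), torch.randn(16, 3))
```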
Deep learning architectures employ various regularization terms to handle diff...
As a dominant paradigm, fine-tuning a pre-trained model on the target data is widely used in many de...
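As a hedged sketch of that dominant fine-tuning paradigm, the helper below replaces the classification head of a pretrained model and assigns the backbone a much smaller learning rate than the new head. The attribute name `fc` for the head and the two learning rates are assumptions for illustration.

```python
import torch

def finetune_setup(pretrained, num_classes, backbone_lr=1e-5, head_lr=1e-3):
    """Hedged sketch of the standard fine-tuning recipe: new task head,
    small learning rate for the pretrained backbone."""
    pretrained.fc = torch.nn.Linear(pretrained.fc.in_features, num_classes)
    optimizer = torch.optim.Adam([
        {"params": [p for n, p in pretrained.named_parameters()
                    if not n.startswith("fc")], "lr": backbone_lr},
        {"params": pretrained.fc.parameters(), "lr": head_lr},
    ])
    return pretrained, optimizer
```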
Recent years have...
Recently, over-parameterized deep networks, with increasingly more network parameters than training ...
We consider transfer learning approaches that fine-tune a pretrained deep neural network on a target...
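One widely used way to regularize such fine-tuning, shown below as a hedged sketch, is to penalize the distance between the current weights and the pretrained starting point (the L2-SP idea). Whether this particular paper uses that penalty is an assumption; the abstract is truncated.

```python
import torch

def l2_sp_penalty(model, init_params, strength=1e-2):
    """Hedged sketch: penalize drift from the pretrained weights during
    fine-tuning.  `init_params` holds detached copies of the starting
    parameters, e.g. [p.detach().clone() for p in model.parameters()]."""
    drift = sum(((p - p0) ** 2).sum()
                for p, p0 in zip(model.parameters(), init_params))
    return strength * drift
```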
Deep learning is a field of research attracting much attention nowadays, mainly because deep architectures...
Overfitting is one issue that deep learning faces in particular. It leads to highly accurate classifiers...
Machine learning using deep neural networks -- also called...
Deep Neural Networks ("deep learning") have become a ubiquitous choice of algorithms for Machine Learning...
When a large feedforward neural network is trained on a small training set, it typically performs poorly...
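The technique this abstract introduces is dropout: randomly omitting feature detectors on each training case. A minimal sketch follows, using the common "inverted dropout" convention of rescaling the survivors during training so no change is needed at test time; that convention is an implementation choice, not a claim about the paper's exact formulation.

```python
import torch

def dropout(activations, p=0.5, training=True):
    """Minimal dropout sketch: zero each activation independently with
    probability p during training, and rescale the survivors by
    1/(1-p) so expected activations match at test time."""
    if not training or p == 0.0:
        return activations
    keep = (torch.rand_like(activations) >= p).float()
    return activations * keep / (1.0 - p)
```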
Unlearning the data observed during the training of a machine learning (ML) model is an important task...
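As a hedged reference point for the unlearning task this abstract names, the sketch below implements the trivial exact baseline: retrain from scratch on the dataset minus the points to forget. Practical unlearning methods approximate this far more cheaply; `train_fn` and the list-based dataset are assumptions for illustration.

```python
def exact_unlearn(train_fn, dataset, forget_indices):
    """Hedged sketch of exact unlearning: drop the points to forget and
    retrain.  This defines the target behavior that cheaper,
    approximate unlearning methods try to match."""
    forget = set(forget_indices)
    retained = [ex for i, ex in enumerate(dataset) if i not in forget]
    return train_fn(retained)
```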