We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations are set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.
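To make the contrast concrete, below is a minimal NumPy sketch of a single fully-connected layer under Dropout versus DropConnect. It is an illustration, not the authors' implementation: the layer sizes, the drop probability p, and the inverted-dropout rescaling by 1/(1-p) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                              # drop probability (illustrative choice)
x = rng.standard_normal(256)         # input activations from the previous layer
W = rng.standard_normal((128, 256))  # weights of a fully-connected layer
b = np.zeros(128)

# Dropout: zero a random subset of *activations* (output units of the layer),
# rescaling by 1/(1-p) so the expected pre-activation is unchanged (the common
# "inverted dropout" convention, assumed here).
m_out = rng.random(128) < (1 - p)           # keep-mask over units
h_dropout = (W @ x + b) * m_out / (1 - p)

# DropConnect: zero a random subset of *weights*. Each output unit then
# receives input from a random subset of units in the previous layer.
M = rng.random(W.shape) < (1 - p)           # keep-mask over individual weights
h_dropconnect = ((M * W) @ x + b) / (1 - p)
```

Note the difference in mask granularity: Dropout draws one mask per unit, while DropConnect draws one mask per weight, so DropConnect randomizes over a strictly larger space of thinned sub-networks.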
In this work, we propose a novel method named Weighted Channel Dropout (WCD) for the regularization ...
As universal function approximators, neural networks have been successfully used for nonlinear dynam...
This paper presents a novel approach to recurrent neural network (RNN) regularization. Differently f...
DropConnect is a recently introduced algorithm to prevent the co-adaptation of feature detectors. Co...
Regularization is essential when training large neural networks. As deep neural networks can be math...
Neural networks are often over-parameterized and hence benefit from aggressive regularization. Conve...
In this paper, we propose an extension of the Extreme Learning Machine algorithm for Single-hidden L...
Deep neural nets with a large number of parameters are very powerful machine learning systems. Howev...
Dropout as a regularization technique is widely used in fully connected layers while it is less effecti...
Recent years have witnessed the success of deep neural networks in dealing with plenty of practica...
The undeniable computational power of artificial neural networks has granted the scientific communit...
Dropout is one of the most popular regularization methods used in deep learning. The general form of...