In this work, we propose a novel method named Weighted Channel Dropout (WCD) for regularizing deep Convolutional Neural Networks (CNNs). Unlike Dropout, which randomly sets neurons to zero in the fully-connected layers, WCD operates on the channels in the stack of convolutional layers. Specifically, WCD consists of two steps, i.e., Rating Channels and Selecting Channels, and three modules, i.e., Global Average Pooling, Weighted Random Selection and Random Number Generator. It filters the channels according to their activation status and can be plugged into any two consecutive layers, unifying the original Dropout and Channel-Wise Dropout. WCD is entirely parameter-free and is deployed only in the training phase with...
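The two-step procedure described above (Rating Channels via Global Average Pooling, then Selecting Channels via Weighted Random Selection) can be sketched roughly as follows. This is a minimal illustration, not the paper's reference implementation: the `keep_ratio` hyperparameter is an assumption, channel activations are assumed nonnegative (post-ReLU), and the paper's additional Random Number Generator filter is omitted for brevity.

```python
import numpy as np

def weighted_channel_dropout(x, keep_ratio=0.9, rng=None):
    """Sketch of Weighted Channel Dropout for one feature map (training only).

    x: convolutional feature map of shape (C, H, W), assumed post-ReLU.
    keep_ratio: assumed hyperparameter, fraction of channels to retain.
    """
    rng = np.random.default_rng() if rng is None else rng
    C = x.shape[0]
    # Step 1 -- Rating Channels: Global Average Pooling yields one
    # activation score per channel.
    scores = np.maximum(x.mean(axis=(1, 2)), 0.0)
    total = scores.sum()
    # Normalize scores into a selection distribution; fall back to
    # uniform if all channels are inactive.
    probs = scores / total if total > 0 else np.full(C, 1.0 / C)
    # Step 2 -- Selecting Channels: Weighted Random Selection keeps
    # channels with probability proportional to their activation score.
    n_keep = max(1, int(round(keep_ratio * C)))
    kept = rng.choice(C, size=n_keep, replace=False, p=probs)
    mask = np.zeros(C, dtype=x.dtype)
    mask[kept] = 1.0
    # Unselected channels are zeroed out; selected ones pass through.
    return x * mask[:, None, None]
```

Because more strongly activated channels are more likely to be retained, the filtering respects each channel's activation status rather than dropping uniformly at random, which is the distinction from Channel-Wise Dropout drawn in the abstract.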
Deep learning-based approaches have been paramount in recent years, mainly due to their outstanding ...
Dropout is one of the most popular regularization methods used in deep learning. The general form of...
Dropout regularization of deep neural networks has been a mysterious yet effective tool to prevent o...
Deep neural nets with a large number of parameters are very powerful machine learning systems. Howev...
Dropout as a regularization technique is widely used in fully connected layers while is less effecti...
The undeniable computational power of artificial neural networks has granted the scientific communit...
Recent years have witnessed the success of deep neural networks in dealing with plenty of practica...
We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large...
Neural networks are often over-parameterized and hence benefit from aggressive regularization. Conve...
We introduce a simple and effective method for regularizing large convolutional neural networks. We ...
Regularization is essential when training large neural networks. As deep neural networks can be math...
Convolutional neural networks (CNNs) were inspired by biology. They are hierarchical neural network...