© Copyright 2016, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved. Recent years have witnessed the success of deep neural networks in dealing with plenty of practical problems. The invention of effective training techniques has contributed largely to this success. The so-called "Dropout" training scheme is one of the most powerful tools for reducing over-fitting. From the statistical point of view, Dropout works by implicitly imposing an L2 regularizer on the weights. In this paper, we present a new training scheme: Shakeout. Instead of randomly discarding units as Dropout does at the training stage, our method randomly chooses to enhance or reverse the contributions of each unit to the next layer. We ...
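The contrast the abstract draws between Dropout (randomly discarding units) and Shakeout (randomly enhancing or reversing each unit's contribution) can be sketched in NumPy. This is an illustrative sketch, not the authors' released code: the hyper-parameter names `tau` and `c` are assumptions, and the particular enhance/reverse scaling below is chosen only so that the perturbed weights equal the original weights in expectation, consistent with the abstract's claim that Dropout acts like an implicit regularizer around the unperturbed network.

```python
import numpy as np

def shakeout_weights(W, tau=0.5, c=0.1, rng=None):
    """Draw one Shakeout perturbation of weight matrix W (a sketch).

    Each hidden unit (column of W) is independently either
    "reversed" (with probability tau) or "enhanced" (with
    probability 1 - tau). The scaling is chosen so that the
    expectation of the perturbed weights recovers W exactly.
    """
    rng = np.random.default_rng(rng)
    s = np.sign(W)
    # One Bernoulli draw per unit (column), broadcast over rows.
    r = (rng.random(W.shape[1]) > tau).astype(W.dtype)
    enhanced = W / (1.0 - tau) + c * s * tau / (1.0 - tau)
    reversed_ = -c * s
    return r * enhanced + (1.0 - r) * reversed_

def dropout_weights(W, tau=0.5, rng=None):
    """Inverted-dropout analogue: zero whole units with probability tau."""
    rng = np.random.default_rng(rng)
    r = (rng.random(W.shape[1]) > tau).astype(W.dtype)
    return r * W / (1.0 - tau)
```

Setting `c = 0` makes the "reversed" branch zero and the "enhanced" branch plain inverted-dropout scaling, so under this parameterization Dropout falls out as a special case of the Shakeout-style perturbation.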
Recently, it was shown that deep neural networks perform very well if the activities of hidden units...
In recent years, deep neural networks have become the state of the art in many machine learning doma...
Dropout regularization of deep neural networks has been a mysterious yet effective tool to prevent o...
© 1979-2012 IEEE. Recent years have witnessed the success of deep neural networks in dealing with a ...
Regularization is essential when training large neural networks. As deep neural networks can be math...
The undeniable computational power of artificial neural networks has granted the scientific communit...
Deep neural nets with a large number of parameters are very powerful machine learning systems. Howev...
Recently it has been shown that when training neural networks on a limited amount of data, randomly ...
Deep neural networks often consist of a great number of trainable parameters for extracting powerful...
Despite powerful representation ability, deep neural networks (DNNs) are prone to over-fitting, beca...
We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large...
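The generalization this entry describes — masking individual connections rather than whole units — can be sketched as follows. This is an illustrative NumPy sketch, not the released DropConnect implementation; the function name and the keep-probability parameter `p` are assumptions.

```python
import numpy as np

def dropconnect_forward(x, W, b, p=0.5, rng=None):
    """One DropConnect layer at training time (a sketch).

    Unlike Dropout, which zeroes entire units (whole rows or
    columns of activity), DropConnect samples an independent
    binary mask over individual weights, keeping each
    connection with probability p.
    """
    rng = np.random.default_rng(rng)
    M = (rng.random(W.shape) < p).astype(W.dtype)  # per-weight mask
    return (M * W) @ x + b
```

With `p = 1.0` the mask is all ones and the layer reduces to a plain affine transform; Dropout corresponds to the special case where the mask is constant along each unit's weights.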
Dropout is one of the most popular regularization methods used in deep learning. The general form of...