Deep feedforward neural networks with piecewise linear activations currently produce state-of-the-art results on several public datasets (e.g., CIFAR-10, CIFAR-100, MNIST, and SVHN). The combination of deep learning models and piecewise linear activation functions allows the estimation of exponentially complex functions through a large number of subnetworks, each specialized in classifying similar input examples. During training, these subnetworks avoid overfitting through an implicit regularization scheme based on the fact that they must share their parameters with other subnetworks. Using this framework, we have made an empirical observation that can further improve the performance of such models. We ...
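The subnetwork view in the abstract above can be made concrete with a small sketch (the weights, layer sizes, and helper names here are hypothetical, not from the paper): in a ReLU network, the on/off pattern of the hidden units selects one "subnetwork", and within the input region sharing that pattern the whole network collapses to a single affine map.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny one-hidden-layer ReLU network with random (illustrative) weights.
W1 = rng.normal(size=(4, 2))
b1 = rng.normal(size=4)
W2 = rng.normal(size=(1, 4))
b2 = rng.normal(size=1)

def forward(x):
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU: piecewise linear
    return W2 @ h + b2

def activation_pattern(x):
    # Which hidden units are "on" -- this identifies the active subnetwork.
    return (W1 @ x + b1) > 0

def local_affine_map(x):
    # Within one activation region, ReLU acts as a fixed 0/1 diagonal mask D,
    # so the network reduces to the affine map  A x + c  with
    # A = W2 D W1 and c = W2 D b1 + b2.
    D = np.diag(activation_pattern(x).astype(float))
    A = W2 @ D @ W1
    c = W2 @ D @ b1 + b2
    return A, c

x = np.array([0.3, -0.2])
A, c = local_affine_map(x)
# The full network and its local affine map agree at x.
assert np.allclose(forward(x), A @ x + c)
```

With 4 hidden units there are at most 2^4 activation patterns, which is the sense in which depth and width yield exponentially many specialized linear pieces while all pieces share the same underlying parameters W1, b1, W2, b2.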
This paper underlines a subtle property of batch-normalization (BN): Successiv...
Activation function is a key component in deep learning that performs non-linear mappings between th...
Deep learning, the study of multi-layered artificial neural networks, has received tremendous attent...
The activation function deployed in a deep neural network has great influence on the performance of ...
MEng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus. The ability o...
Numerous approaches address over-fitting in neural networks: by imposing a penalty on the parameters...
Effective regularisation of neural networks is essential to combat overfitting due to the large numb...
Abstract. In this paper we propose and investigate a novel nonlinear unit, called Lp unit, for deep ...
This paper focuses on the enhancement of the generalization ability and training stability of deep n...
Deep Learning in the field of Big Data has become essential for the analysis and perception of trend...
The weight initialization and the activation function of deep neural networks have a crucial impact ...
Adversarial training has been shown to regularize deep neural networks in addition to increasing the...
Recent years have witnessed the success of deep neural networks in dealing with a ...
Activation functions are crucial in deep learning networks, given that the nonlinear ability of acti...
Activation functions provide deep neural networks the non-linearity that is necessary to learn compl...