Rectified linear activation units are important components for state-of-the-art deep convolutional networks. In this paper, we propose a novel S-shaped rectified linear activation unit (SReLU) to learn both convex and non-convex functions, imitating the multiple function forms given by the two fundamental laws, namely the Weber-Fechner law and the Stevens law, in psychophysics and neural sciences. Specifically, SReLU consists of three piecewise linear functions, which are formulated by four learnable parameters. The SReLU is learned jointly with the training of the whole deep network through back propagation. During the training phase, to initialize SReLU in different layers, we propose a “freezing” method to degenerate SReLU into a predefin...
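A minimal sketch of the piecewise form described in this abstract is given below, assuming per-channel parameters named t_r, a_r, t_l, a_l for the right/left thresholds and slopes; the names, initialization values, and broadcasting over NCHW tensors are illustrative choices rather than the paper's exact formulation.

```python
import torch
import torch.nn as nn

class SReLU(nn.Module):
    """Sketch of an S-shaped rectified linear unit: three linear pieces
    controlled by four learnable per-channel parameters."""

    def __init__(self, num_channels):
        super().__init__()
        # Initial values are placeholders; the paper instead uses a
        # "freezing" method that degenerates SReLU into a predefined function
        # early in training.
        self.t_r = nn.Parameter(torch.ones(num_channels))   # right threshold
        self.a_r = nn.Parameter(torch.ones(num_channels))   # right slope
        self.t_l = nn.Parameter(torch.zeros(num_channels))  # left threshold
        self.a_l = nn.Parameter(torch.zeros(num_channels))  # left slope

    def forward(self, x):
        # Broadcast per-channel parameters over (N, C, H, W) activations.
        t_r = self.t_r.view(1, -1, 1, 1)
        a_r = self.a_r.view(1, -1, 1, 1)
        t_l = self.t_l.view(1, -1, 1, 1)
        a_l = self.a_l.view(1, -1, 1, 1)
        # Identity between the two thresholds, separate linear slopes outside.
        y = torch.where(x >= t_r, t_r + a_r * (x - t_r), x)
        y = torch.where(x <= t_l, t_l + a_l * (x - t_l), y)
        return y
```

In use, such a module would replace a fixed ReLU after a convolutional layer, so the four parameters are updated together with the rest of the network by back propagation, as the abstract describes.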
The paper proposes a new class of nonlinear operators and a dual learning para...
By applying concepts from the statistical physics of learning, we study layered neural networks of r...
We introduce a variational framework to learn the activation functions of deep neural networks. Our ...
Activation functions provide deep neural networks the non-linearity that is necessary to learn compl...
Activation function is a key component in deep learning that performs non-linear mappings between th...
This paper focuses on the enhancement of the generalization ability and training stability of deep n...
Activation functions are essential for deep learning methods to learn and perform complex tasks such...
The activation function plays a key role in influencing the performance and training dynamics of neur...
In this paper, we introduce a novel type of Rectified Linear Unit (ReLU), called a Dual Rectified Li...
Deep Belief Network (DBN) is made up of stacked Restricted Boltzmann Machine layers ass...
Rectified linear units (ReLUs) are currently the most popular activation function used in neural net...
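For reference, the standard rectified linear unit mentioned here, along with a common leaky variant, can be written in a few lines; the sketch below shows the usual definitions and is not specific to any single paper in this list.

```python
import numpy as np

def relu(x):
    # Standard rectified linear unit: passes positives, zeroes out negatives.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky variant: negatives are scaled by a small slope instead of clipped.
    return np.where(x >= 0, x, alpha * x)
```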
In this paper we propose and investigate a novel nonlinear unit, called Lp unit, for deep ...
Deep feedforward neural networks with piecewise linear activations are currently producing the state...
In deep learning models, the inputs to the network are processed using activation functions to gener...