The paper proposes a new class of nonlinear operators and a dual learning paradigm in which optimization jointly concerns both the linear convolutional weights and the parameters of these nonlinear operators. The nonlinear class proposed to provide a rich functional representation is composed of functions called rectified parametric sigmoid units. This class is constructed to combine the advantages of both sigmoid and rectified linear unit functions while avoiding their respective drawbacks. Moreover, the analytic form of this new neural class involves scale, shift and shape parameters, yielding a wide range of activation shapes that include the standard rectified linear unit as a limit case. Parameters of this neural tr...
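The truncated abstract does not reproduce the closed-form expression of a rectified parametric sigmoid unit, so the sketch below is only a hypothetical illustration of the scale/shift/shape idea and of the ReLU limit case it mentions; the function `repsu_like` and its exact parametrization are assumptions, not the paper's definition.

```python
import numpy as np

def repsu_like(x, scale=1.0, shift=0.0, shape=1.0):
    # Hypothetical stand-in, NOT the paper's analytic form: a sigmoid gate
    # with a shape/shift parametrization multiplies a shifted, scaled
    # rectification. As shape -> infinity the gate hardens and the unit
    # tends to a scaled, shifted ReLU, matching the "ReLU as a limit case"
    # property described in the abstract above.
    gate = 1.0 / (1.0 + np.exp(-shape * (x - shift)))
    return scale * np.maximum(x - shift, 0.0) * gate

x = np.linspace(-3.0, 3.0, 7)
print(repsu_like(x, shape=20.0))  # close to ReLU(x) for a sharp gate
print(repsu_like(x, shape=1.0))   # smoother, sigmoid-like transition
```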
Rectified linear units (ReLUs) are currently the most popular activation function used in neural net...
Transfer functions play a very important role in the learning process of neural systems. This paper pres...
Nonlinear system identification and prediction is a complex task, and often non-parametric models su...
Rectified linear activation units are important components for state-of-the-art deep convolutional n...
Approximation of highly nonlinear functions is an important area of computational intelligence. The ...
Network training algorithms have heavily concentrated on the learning of connection weights. Little ...
The Exponential Linear Unit (ELU) has been proven to speed up learning and improve the classificatio...
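For context, the ELU referred to here has the standard form: identity for positive inputs and alpha * (exp(x) - 1) for non-positive inputs, giving smooth saturation towards -alpha. The short sketch below simply restates that definition in NumPy.

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: identity for x > 0, smooth exponential saturation towards
    # -alpha for x <= 0.
    return np.where(x > 0, x, alpha * np.expm1(x))

print(elu(np.array([-2.0, -0.5, 0.0, 1.5])))
```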
Non-linear activation functions are integral parts of deep neural architectures. Given the large and...
In this paper we propose and investigate a novel nonlinear unit, called Lp unit, for deep ...
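The Lp unit's definition is cut off in this truncated abstract; as a rough, hypothetical sketch of the learned-norm idea it names, a unit can pool a group of inputs through an Lp norm whose order p is treated as a trainable parameter. The exact formulation in the cited paper may differ.

```python
import numpy as np

def lp_unit(x, p=2.0, eps=1e-8):
    # Hypothetical learned-norm (Lp) pooling over a group of inputs x.
    # p is meant to be trainable (p >= 1): p = 1 gives mean-of-magnitudes
    # pooling, large p approaches max-magnitude pooling. Sketch only,
    # not the paper's exact formulation.
    x = np.abs(x) + eps
    return np.mean(x ** p) ** (1.0 / p)

group = np.array([0.1, -0.4, 2.0, 0.3])
for p in (1.0, 2.0, 10.0):
    print(p, lp_unit(group, p))
```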
The activation function is a key component in deep learning that performs non-linear mappings between th...
The activation function is the basic component of the convolutional neural network (CNN), which prov...
This paper analyses both nonlinear activation functions and spatial max-poolin...
The parameter space of neural networks has a Riemannian metric structure. The natural Riemannian g...
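For reference, the natural gradient this abstract refers to rescales the ordinary gradient by the inverse of the Riemannian metric on parameter space, which for probabilistic models is the Fisher information matrix:

```latex
\tilde{\nabla} L(\theta) \;=\; G(\theta)^{-1}\,\nabla L(\theta),
\qquad
G(\theta) \;=\; \mathbb{E}_{x}\!\left[\nabla_{\theta}\log p(x\mid\theta)\,
\nabla_{\theta}\log p(x\mid\theta)^{\top}\right]
```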
We investigate the computational power of recurrent neural networks that apply the sigmoid a...