This paper analyses both nonlinear activation functions and spatial max-pooling for Deep Convolutional Neural Networks (DCNNs) by means of the algebraic basis of mathematical morphology. Additionally, a general family of activation functions is proposed by considering both max-pooling and nonlinear operators in the context of morphological representations. The experimental section validates the effectiveness of our approach on classical benchmarks for supervised learning with DCNNs.
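As a concrete illustration of the morphological view taken in this abstract (a minimal sketch, not code from the paper): spatial max-pooling can be written as a flat grayscale dilation with a k x k structuring element followed by subsampling on the stride grid. The NumPy functions below, flat_dilation and max_pool, are hypothetical names introduced only for this example.

```python
import numpy as np

def flat_dilation(x, k):
    """Grayscale dilation of a 2D array by a flat k x k structuring element:
    each output pixel is the max over a k x k neighbourhood ('valid' positions)."""
    h, w = x.shape
    out = np.full((h - k + 1, w - k + 1), -np.inf)
    for u in range(k):
        for v in range(k):
            out = np.maximum(out, x[u:u + h - k + 1, v:v + w - k + 1])
    return out

def max_pool(x, k, stride):
    """Standard spatial max-pooling = flat dilation followed by subsampling
    on the stride grid (no padding)."""
    return flat_dilation(x, k)[::stride, ::stride]

# Example: 2x2 max-pooling with stride 2 on a random 8x8 feature map.
x = np.random.rand(8, 8)
assert max_pool(x, k=2, stride=2).shape == (4, 4)
```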
This report introduces a novel algorithm to learn the width of non-linear activation functions (of a...
MEng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus. The ability o...
We introduce a variational framework to learn the activation functions of deep neural networks. Our ...
Over the past decade, Convolutional Networks (ConvNets) have renewed the perspectives of the researc...
This paper focuses on the enhancement of the generalization ability and training stability of deep n...
In the last ten years, Convolutional Neural Networks (CNNs) have formed the ba...
Activation functions provide deep neural networks the non-linearity that is necessary to learn compl...
Following recent advances in morphological neural networks, we propose to stud...
Convolutional Neural Networks (CNNs) have proven to be an effective approach for solving image cl...
Motivated by recent advances in morphological neural networks, we further stud...
Neural networks and particularly Deep learning have been comparatively little studied from the theor...
Mathematical morphology is a theory and technique applied to collect features like geometric and top...
The activation function is the basic component of the convolutional neural network (CNN), which prov...
The paper proposes a new class of nonlinear operators and a dual learning para...
Morphological neural networks (MNNs) can be characterized as a class of artificial neural networks t...