Standard feedforward neural networks benefit from the nice theoretical properties of mixtures of sigmoid activation functions, but they may fail in several practical learning tasks. Such tasks would be better addressed by relying on a more appropriate, problem-specific basis of activation functions. This paper presents a connectionist model that exploits adaptive activation functions. Each hidden unit in the network is associated with a specific pair (f(·), p(·)), where f(·) is the activation function and p(·) is the likelihood that the unit is relevant to the computation of the network output on the current input. The function f(·) is optimized in a supervised manner, while p(·) is realized via a statistical parametric model learned through...
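The pairing described above can be sketched minimally in code. Note this is an illustrative assumption, not the paper's actual model: f(·) is taken to be a tanh with a trainable slope, and p(·) an isotropic Gaussian likelihood over the input; the class name and all parameter choices are hypothetical.

```python
import numpy as np

class AdaptiveUnit:
    """One hidden unit pairing an adaptive activation f(.) with a
    relevance likelihood p(.). Illustrative sketch only."""

    def __init__(self, dim, rng):
        self.w = rng.normal(size=dim)   # connection weights
        self.slope = 1.0                # trainable shape parameter of f(.)
        self.mu = np.zeros(dim)         # mean of the Gaussian model for p(.)
        self.sigma = 1.0                # spread of the Gaussian model

    def f(self, x):
        # Adaptive activation: tanh with a learnable slope.
        return np.tanh(self.slope * (self.w @ x))

    def p(self, x):
        # Likelihood that this unit is relevant for input x
        # (unnormalized isotropic Gaussian, an assumed form).
        d2 = np.sum((x - self.mu) ** 2)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def output(self, x):
        # Contribution to the network output: relevance-weighted activation.
        return self.p(x) * self.f(x)

rng = np.random.default_rng(0)
unit = AdaptiveUnit(dim=3, rng=rng)
x = np.array([0.5, -0.2, 0.1])
y = unit.output(x)
```

In this reading, units whose p(·) is low for the current input contribute little to the output, so the network effectively selects a problem-specific subset of activation functions per input.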
Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of n...
We investigate a novel neural network model which uses stochastic weights. It is shown that the func...
Neuroevolution methods evolve the weights of a neural network, and in some cases the topology, but l...
In neural networks literature, there is a strong interest in identifying and defining activation fun...
Network training algorithms have heavily concentrated on the learning of connection weights. Little ...
In the paper, an ontogenic artificial neural network (ANN) is proposed. The network uses orthogonal...
The topic of supervised learning within the conceptual framework of artificial neural network (ANN) ...
An activation function, possibly new, is proposed for use in digital simulation of artificial neural ...
This dissertation presents a new strategy for the automatic design of neural networks. The learning ...
This paper focuses on multilayer perceptron neural networks where the activation functions are adapt...
A comprehensive review on the problem of choosing a suitable activation function for the hidden laye...
This report introduces a novel algorithm to learn the width of non-linear activation functions (of a...
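The report's algorithm is truncated above, but the general idea of learning an activation function's width can be sketched as follows. This is a hedged stand-in, not the report's method: the slope beta of a sigmoid is treated as a width-like trainable parameter and updated by plain gradient descent on squared error over a toy fitting task; all names and settings are illustrative.

```python
import numpy as np

def sigmoid(z, beta):
    # beta acts as a width/slope parameter: large beta -> sharp transition.
    return 1.0 / (1.0 + np.exp(-beta * z))

rng = np.random.default_rng(1)
z = rng.uniform(-3.0, 3.0, size=200)
target = sigmoid(z, beta=2.5)          # data generated with "true" width 2.5

beta, lr = 1.0, 2.0
for _ in range(2000):
    y = sigmoid(z, beta)
    err = y - target
    # dL/dbeta for L = mean(err^2)/2, using dy/dbeta = z * y * (1 - y).
    grad = np.mean(err * z * y * (1.0 - y))
    beta -= lr * grad
```

Because the model is well specified, gradient descent on this single parameter recovers a slope close to the one that generated the data; in a full network the same width gradient would simply be accumulated alongside the weight gradients.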
Considering computational algorithms available in the literature, associated with supervised learnin...