The problem of choosing a suitable activation function for the hidden layer of a feed-forward neural network has been widely investigated; this paper provides a comprehensive review. Since the nonlinear component of a neural network is the main contributor to its mapping capabilities, the different choices that may lead to enhanced performance, in terms of training, generalization, or computational cost, are analyzed, both in general-purpose and in embedded computing environments. Finally, a strategy to convert a network configuration between different activation functions without altering the network mapping capabilities is presented.
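The conversion strategy mentioned at the end of the abstract can be illustrated with the classical identity logistic(z) = (tanh(z/2) + 1)/2: a one-hidden-layer network with logistic units can be rewritten exactly as a tanh network by rescaling its weights and biases. A minimal sketch, with illustrative weights that are not taken from the paper:

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny 2-input, 2-hidden-unit, 1-output network with logistic hidden units
# (weights are illustrative values, not from the reviewed work).
W1 = [[0.4, -0.7], [1.1, 0.3]]   # hidden-layer weights
b1 = [0.1, -0.2]                 # hidden-layer biases
W2 = [0.9, -1.5]                 # output weights
b2 = 0.05                        # output bias

def forward_logistic(x):
    h = [logistic(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2

# Equivalent tanh network: since logistic(z) = (tanh(z/2) + 1)/2, halve the
# hidden pre-activations, halve the output weights, and shift the output
# bias by half the sum of the original output weights.
W1t = [[w / 2 for w in row] for row in W1]
b1t = [b / 2 for b in b1]
W2t = [w / 2 for w in W2]
b2t = b2 + sum(W2) / 2

def forward_tanh(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1t, b1t)]
    return sum(w * hi for w, hi in zip(W2t, h)) + b2t

# Both networks compute exactly the same function.
assert abs(forward_logistic([0.5, -1.2]) - forward_tanh([0.5, -1.2])) < 1e-12
```

The same algebra extends layer by layer to deeper feed-forward networks, which is what makes such activation-function conversions possible without retraining.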
Neural networks are constructed and trained on powerful workstations. For real-world applications, h...
The weight initialization and the activation function of deep neural networks have a crucial impact ...
Traditional supervised neural network trainers have deviated little from the fundamental back propag...
Copyright © 2015 Antonino Laudani et al. This is an open access article distributed under the Creati...
In neural networks literature, there is a strong interest in identifying and defining activation fun...
Artificial neural networks are function-approximating models that can improve themselves with experi...
The activation function deployed in a deep neural network has great influence on the performance of ...
Neural networks are computing systems modelled after the biological neural network of animal brain a...
Constructive learning algorithms are an efficient way to train feedforward neural networks. Some of ...
Abstract. We present an analysis of the computational capabilities of feed-forward neural networks f...
Artificial Neural Networks (ANNs) are one of the most comprehensive tools for classification. In t...
The training of a multilayer perceptron is generally a difficult task. Excessive training times and la...
This paper presents a theoretical analysis and some experimental results concerning the effects of b...
This report introduces a novel algorithm to learn the width of non-linear activation functions (of a...