This paper discusses properties of activation functions in multilayer neural networks applied to pattern classification. A rule of thumb for selecting activation functions, or combinations of them, is proposed. The sigmoid, Gaussian, and sinusoidal functions are selected for their independent and fundamental space-division properties. The sigmoid function is not effective for a single hidden unit; in contrast, the other functions can provide good performance. When several hidden units are employed, the sigmoid function is useful, although its convergence is still slower than that of the others. The Gaussian function is sensitive to additive noise, while the others are rather insensitive. As a result, based on convergence rates, the minimu...
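To make the three candidate activations from this abstract concrete, here is a minimal sketch of their definitions (the function names and test points are illustrative, not from the paper):

```python
import numpy as np

def sigmoid(x):
    """Monotonic, bounded activation; splits the input space with a soft half-plane."""
    return 1.0 / (1.0 + np.exp(-x))

def gaussian(x):
    """Localized activation; responds strongly only near x = 0."""
    return np.exp(-x ** 2)

def sinusoidal(x):
    """Periodic activation; partitions the input space into repeating regions."""
    return np.sin(x)

# Evaluate each on a few sample inputs to see their differing shapes.
x = np.array([-2.0, 0.0, 2.0])
print("sigmoid:   ", sigmoid(x))
print("gaussian:  ", gaussian(x))
print("sinusoidal:", sinusoidal(x))
```

The contrasting shapes (monotonic vs. localized vs. periodic) are what give the three functions the independent space-division properties the abstract refers to.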
Non-linear activation functions are integral parts of deep neural architectures. Given the large and...
The generalization capabilities of deep neural networks are not well understood, and in particular, ...
A common practice for developing a Neural Network architecture is to build models in which each laye...
Kanazawa University, Institute of Science and Engineering, Division of Electrical Engineering and Computer Science. An optimization method of activation functions is proposed. Three typical functions a...
This article discusses a number of reasons why the use of non-monotonic functions as activation func...
© 2018 IEEE. Artificial feedforward neural networks for simple objects recognition of different conf...
Multi-Valued Neuron (MVN) was proposed for pattern classification. It operates with complex-valued ...
Multilayer perceptron network (MLP) has been recognized as a powerful tool for many applications inc...
The performance of two algorithms may be compared using an asymptotic technique in algorithm analysi...
Multilayer perceptron network (MLP) has been recognized as a powerful tool for many applications inc...
The activation function used to transform the activation level of a unit (neuron) into an output sig...
MEng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus. The ability o...
In this paper a new neural network architecture, based on an adaptive activation function, called ge...
Signal classification performance using multilayer neural network (MLNN) and the conventional signal...
Background: Artificial neural networks are motivated by the biological nervous system and can be used...