This thesis presents the use of a new sigmoid activation function in backpropagation artificial neural networks (ANNs). ANNs using conventional activation functions may generalize poorly when trained on a data set that includes quirky, mislabeled, unbalanced, or otherwise complicated data. This new activation function is an attempt to improve generalization and reduce overtraining on mislabeled or irrelevant data by restricting training when inputs to the hidden neurons are sufficiently small. This activation function includes a flattened, low-training region that grows or shrinks during backpropagation to ensure a desired proportion of inputs inside the low-training region. With a desired low-training proportion of 0, this activation functio...
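The abstract does not give the exact functional form, but a minimal sketch of the idea might look as follows, assuming a logistic sigmoid flattened to a zero-slope plateau of half-width w around the origin and a simple proportional rule for resizing that plateau; the names flat_sigmoid and adapt_flat_width, and the update rule itself, are illustrative assumptions rather than the thesis's definition:

    import numpy as np

    def flat_sigmoid(x, w):
        """Logistic sigmoid with a flattened (zero-slope) region of half-width w
        around zero. Inside |x| <= w the output is held at 0.5, so the local
        gradient vanishes and little training occurs; with w = 0 this reduces
        to the ordinary logistic sigmoid. (Assumed form, for illustration only.)"""
        shifted = np.sign(x) * np.maximum(np.abs(x) - w, 0.0)
        return 1.0 / (1.0 + np.exp(-shifted))

    def adapt_flat_width(w, hidden_inputs, target_fraction, rate=0.01):
        """Hypothetical update rule: grow or shrink the flat region so that
        roughly `target_fraction` of the hidden-neuron net inputs fall inside it."""
        inside = float(np.mean(np.abs(hidden_inputs) <= w))
        return max(0.0, w + rate * (target_fraction - inside))

With a target fraction of zero, this update drives w toward zero and flat_sigmoid coincides with the standard logistic sigmoid, which is consistent with the abstract's remark about a desired low-training proportion of 0.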
The performance of two algorithms may be compared using an asymptotic technique in algorithm analysi...
Traditional supervised neural network trainers have deviated little from the fundamental back propag...
In this paper the Sigma-if artificial neural network model is considered, which is a generalization ...
Activation functions are an essential part of artificial neural networks. Over the years, researches...
This paper presents some simple techniques to improve the backpropagation algorithm. Since learning ...
This paper presents the backpropagation algorithm based on an extended network approach in which the...
Artificial feedforward neural networks for recognition of simple objects of different conf...
An activation function, possibly new, is proposed for use in digital simulation of artificial neural ...
MEng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus. The ability o...
In the infancy of backpropagation [1, 2], the shape of the (differentiable) activation function was in...
In this paper, the effects of different activation functions on neural networks are discussed.
Thesis (M.Sc.), University of Natal, Durban, 1992. Artificial neural networks (ANNs) were originally i...
This paper sums up the main contributions of the PhD dissertation of the same name to the cur...
The Back-propagation (BP) training algorithm is a renowned representative of all iterative ...
Neural networks have seen tremendous growth in recent years and have been applied to numerous problems. Various typ...