This paper focuses on the enhancement of the generalization ability and training stability of deep neural networks (DNNs). New activation functions that we call bounded rectified linear unit (ReLU), bounded leaky ReLU, and bounded bi-firing are proposed. These activation functions are defined based on the desired properties of the universal approximation theorem (UAT). Additional work providing a new set of coefficient values for the scaled hyperbolic tangent function is also presented. These contributions result in improved classification performance and training stability in DNNs. Experiments using multilayer perceptron (MLP) and convolutional neural network (CNN) models have shown that the proposed activation functions outperfo...
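Since the entry above names the bounded activation variants without giving their definitions, the following is a minimal illustrative sketch of the clipped forms they suggest. It assumes an upper saturation level A and leak slope alpha as free parameters (neither value is specified in the text), and it omits the bounded bi-firing variant, whose exact form is not given here; this is not the paper's reference implementation.

```python
import numpy as np

def bounded_relu(x, A=1.0):
    """Bounded ReLU: standard ReLU clipped at an assumed upper bound A."""
    return np.minimum(np.maximum(x, 0.0), A)

def bounded_leaky_relu(x, A=1.0, alpha=0.01):
    """Bounded leaky ReLU: small slope alpha for negative inputs, clipped at A."""
    return np.minimum(np.where(x >= 0.0, x, alpha * x), A)

# Quick check that both functions saturate as intended.
x = np.linspace(-3, 3, 7)
print(bounded_relu(x, A=1.0))        # 0 below zero, capped at 1 above
print(bounded_leaky_relu(x, A=1.0))  # small negative values below zero, capped at 1 above
```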
Deep neural networks (DNNs) have garnered significant attention in various fields of science and tec...
In this paper we address the issue of output instability of deep neural networks: small perturbation...
The activation function deployed in a deep neural network has great influence on the performance of ...
Deep Belief Network (DBN) is made up of stacked Restricted Boltzmann Machine layers ass...
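The entry above describes a DBN as a stack of Restricted Boltzmann Machine layers. Below is a minimal structural sketch of that idea, assuming sigmoid hidden units and purely illustrative layer sizes; RBM pre-training (e.g., contrastive divergence) is omitted, so this only shows how the layers stack.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """A single Restricted Boltzmann Machine layer (weights and hidden biases only; training omitted)."""
    def __init__(self, n_visible, n_hidden, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_hidden = np.zeros(n_hidden)

    def hidden_activations(self, v):
        # Probability that each hidden unit is on, given visible vector v.
        return sigmoid(v @ self.W + self.b_hidden)

class DBN:
    """Deep Belief Network as a stack of RBM layers, per the description above."""
    def __init__(self, layer_sizes):
        self.rbms = [RBM(n_in, n_out) for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

    def forward(self, v):
        # Propagate data upward through the stacked RBMs.
        h = v
        for rbm in self.rbms:
            h = rbm.hidden_activations(h)
        return h

# Example: a 784-500-250-30 stack, as one might use for MNIST-sized inputs (sizes are illustrative).
dbn = DBN([784, 500, 250, 30])
print(dbn.forward(np.random.default_rng(1).random(784)).shape)  # -> (30,)
```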
Finding minimum distortion of adversarial example...
MEng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus. The ability o...
Activation functions are essential for deep learning methods to learn and perform complex tasks such...
Activation functions provide deep neural networks with the non-linearity that is necessary to learn compl...
Convolutional Neural Networks (CNNs) have proven to be an effective approach for solving image cl...
The final publication is available at Elsevier via https://doi.org/10.1016/j.patcog.2019.07.006. © 2...
Researchers have proposed various activation functions. These activation functions help the deep net...
The activation function plays an important role in training and improving performance in deep neural...
Deep Learning in the field of Big Data has become essential for the analysis and perception of trend...
Deep feedforward neural networks with piecewise linear activations are currently producing the state...