Deep neural networks, as a powerful system for representing high-dimensional complex functions, play a key role in deep learning. Convergence of deep neural networks is a fundamental issue in building the mathematical foundation of deep learning. We investigated the convergence of deep ReLU networks and deep convolutional neural networks in two recent works (arXiv:2107.12530, 2109.13542). Only the Rectified Linear Unit (ReLU) activation was studied therein, and the important pooling strategy was not considered. In the current work, we study the convergence of deep neural networks, as the depth tends to infinity, for two other important activation functions: the leaky ReLU and the sigmoid function. Pooling will also be studied. As a result,...
Deep neural networks (DNNs) have garnered significant attention in various fields of science and tec...
In a recent paper, Ling et al. investigated the over-parametrized Deep Equilibrium Model (DEQ) with ...
We contribute to a better understanding of the class of functions that is represented by a neural ne...
We explore convergence of deep neural networks with the popular ReLU activation function, as the dep...
Various powerful deep neural network architectures have made great contributions to the exciting succ...
Recently there has been much interest in understanding why deep neural networks are preferred to sha...
This article presents a new criterion for convergence of gradient descent to a global minimum. The c...
In this article we study fully-connected feedforward deep ReLU ANNs with an arbitrarily large number...
Recent work by Jacot et al. (2018) has shown that training a neural network using gradient descent i...
In the article, emphasis is put on the modern artificial neural network structure, which in the lite...
Advanced deep neural networks (DNNs), designed by either human or AutoML algorithms, are growing inc...
The generalization capabilities of deep neural networks are not well understood, and in particular, ...
We consider neural networks with rational activation functions. The choice of the nonlinear activati...
MEng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus. The ability o...
This paper focuses on establishing $L^2$ approximation properties for deep ReLU convolutional neural...