Deep learning, the study of multi-layered artificial neural networks, has received tremendous attention over the last few years. Neural networks now outperform humans in a growing variety of tasks and increasingly affect our day-to-day lives. There is a wide range of potential directions in which to advance deep learning, two of which we investigate in this thesis: (1) One of the key components of a network is its activation function, which has a large impact on the overall mathematical form of the network. The \textit{first paper} studies generalisation in neural networks with rectified linear activation units (“ReLUs”). Such networks partition the input space into so-called linear regions, which are the regions on which the network computes a single affine function.
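
As a minimal illustration of this notion (not taken from the thesis, and using an arbitrary randomly initialised network as an assumption), the following Python sketch enumerates the linear regions of a one-dimensional, one-hidden-layer ReLU network: each hidden unit contributes a potential breakpoint, and between consecutive breakpoints the activation pattern, and hence the affine piece computed by the network, is fixed.

```python
# Minimal sketch: linear regions of a 1-D, one-hidden-layer ReLU network.
# Each hidden unit w1[i]*x + b1[i] switches at x = -b1[i]/w1[i]; between
# these breakpoints the set of active units is constant, so the network
# computes a single affine function on each such interval.
import numpy as np

rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=5), rng.normal(size=5)   # hidden layer: 5 ReLU units
w2, b2 = rng.normal(size=5), rng.normal()         # linear output layer

def net(x):
    return w2 @ np.maximum(w1 * x + b1, 0.0) + b2

# Breakpoints where some hidden unit switches between active and inactive.
breaks = np.sort(-b1 / w1)
print("region boundaries:", breaks)

# Inside each region the activation pattern is fixed, so the slope of the
# affine piece can be read off directly from the active units.
for lo, hi in zip(breaks[:-1], breaks[1:]):
    mid = 0.5 * (lo + hi)
    active = (w1 * mid + b1) > 0
    slope = w2[active] @ w1[active]
    print(f"region ({lo:.2f}, {hi:.2f}): active units {active.astype(int)}, "
          f"slope {slope:.2f}, net(mid) = {net(mid):.2f}")
```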