Many researchers implicitly assume that neural networks learn relations and generalise them to new, unseen data. It has been shown recently, however, that the generalisation of feed-forward networks fails for identity relations. The proposed solution to this problem is to create an inductive bias with Differential Rectifier (DR) units. In this work we explore whether various factors in the neural network architecture and learning process make a difference to the generalisation of equality detection by neural networks without and with DR units, in early and mid fusion architectures. In experiments with synthetic data we find effects of the number of hidden layers, the activation function and the data representation. The training se...
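To make the setup concrete, the following is a minimal sketch of how such an inductive bias can be wired into an equality detector. It assumes that a DR unit computes the element-wise rectified difference of the two inputs in both directions, relu(a - b) + relu(b - a) = |a - b|, and that early fusion concatenates these values with the raw inputs while mid fusion concatenates them with the hidden activations; the class and function names (EqualityNet, dr_units) and the layer sizes are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn


def dr_units(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Differential Rectifier units (assumed form): element-wise |a - b| via two ReLUs."""
    return torch.relu(a - b) + torch.relu(b - a)


class EqualityNet(nn.Module):
    """Feed-forward equality detector with optional DR units.

    fusion='early' concatenates the DR outputs with the raw inputs before the
    hidden layer; fusion='mid' concatenates them with the hidden activations.
    """

    def __init__(self, dim: int, hidden: int = 32, fusion: str = "early"):
        super().__init__()
        self.fusion = fusion
        in_dim = 2 * dim + (dim if fusion == "early" else 0)
        self.hidden = nn.Linear(in_dim, hidden)
        mid_dim = hidden + (dim if fusion == "mid" else 0)
        self.out = nn.Linear(mid_dim, 1)

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        dr = dr_units(a, b)                    # rectified differences
        x = torch.cat([a, b], dim=-1)          # plain concatenation of the pair
        if self.fusion == "early":
            x = torch.cat([x, dr], dim=-1)     # early fusion: DR at the input
        h = torch.relu(self.hidden(x))
        if self.fusion == "mid":
            h = torch.cat([h, dr], dim=-1)     # mid fusion: DR at the hidden layer
        return torch.sigmoid(self.out(h))      # probability that a == b


# Usage: score whether two binary vectors are equal.
a = torch.randint(0, 2, (4, 10)).float()
b = a.clone()
model = EqualityNet(dim=10, fusion="mid")
print(model(a, b).shape)  # torch.Size([4, 1])
```

The point of the DR path is that |a - b| is identically zero exactly when the inputs are equal, so the equality decision no longer has to be learned from scratch by the fully connected layers.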