The hidden layer neurons in a multi-layered feed-forward neural network play a critical role. From one perspective, the hidden layer neurons establish (linear) decision boundaries in the feature space. These linear decision boundaries are then combined by succeeding layers, leading to convex (open) regions and thereafter to arbitrarily shaped decision boundaries. In this paper we show that the use of unidirectional Gaussian lateral connections from a hidden layer neuron to an adjacent hidden layer neuron leads to a much richer class of decision boundaries. In particular, the proposed class of networks retains the advantage of sigmoidal feed-forward networks (global characteristics) while adding the flexibility of being able to represent local structure. An algori...
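The layer-by-layer construction of decision regions described in this abstract can be made concrete with a small toy example. The sketch below is not the lateral-connection network the paper proposes; it only illustrates the conventional picture the abstract starts from, in which first-layer threshold units define half-planes, a second layer ANDs them into a convex region, and a third layer ORs convex regions into an arbitrarily shaped one. The hand-chosen weights and the helper names half_plane, convex_region and union_region are assumptions made purely for exposition.

    # Toy illustration: linear boundaries -> convex region (AND) -> arbitrary region (OR).
    # All weights are hand-chosen; this is not a trained network.
    import numpy as np

    def step(x):
        """Hard-threshold activation: 1 if x >= 0, else 0."""
        return (x >= 0).astype(float)

    def half_plane(points, w, b):
        """First-layer unit: one linear decision boundary w.x + b = 0."""
        return points @ w + b >= 0

    def convex_region(points, planes):
        """Second-layer unit: AND of half-planes (all constraints satisfied)."""
        votes = np.column_stack([step(points @ w + b) for w, b in planes])
        n = votes.shape[1]
        return step(votes.sum(axis=1) - n + 0.5)   # fires only if all n planes are active

    def union_region(points, regions):
        """Third-layer unit: OR of convex regions."""
        votes = np.column_stack(regions)
        return step(votes.sum(axis=1) - 0.5)       # fires if any region is active

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        pts = rng.uniform(-2, 2, size=(5, 2))

        # Unit square as the intersection of four half-planes.
        square = [(np.array([ 1.0,  0.0]), 1.0),   # x >= -1
                  (np.array([-1.0,  0.0]), 1.0),   # x <=  1
                  (np.array([ 0.0,  1.0]), 1.0),   # y >= -1
                  (np.array([ 0.0, -1.0]), 1.0)]   # y <=  1

        # A diagonal band, 1.5 <= x + y <= 2.5, as a second convex region.
        band = [(np.array([1.0, 1.0]), -1.5),
                (np.array([-1.0, -1.0]), 2.5)]

        inside_square = convex_region(pts, square)
        inside_band   = convex_region(pts, band)
        decision      = union_region(pts, [inside_square, inside_band])
        for p, d in zip(pts, decision):
            print(p, "->", int(d))

Sigmoidal units behave in the same qualitative way as the hard thresholds used here, only with soft rather than sharp boundaries, which is the global behaviour the abstract contrasts with the local structure contributed by the Gaussian lateral connections.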
Up to now, many neural network models have been proposed. In our study, we focus on two kinds of feedf...
This study focuses on the subject of weight initialization in multi-layer feed-forward networks....
Abstract. Typically, the response of a multilayered perceptron (MLP) network on points which are far ...
Abstract: In this paper, we provide a thorough analysis of decision boundaries of neural networks wh...
The back-propagation algorithm led to a tremendous breakthrough in the application of multilayer per...
Artificial neural network models, particularly the perceptron and the backpropagation network, do no...
In his seminal paper, Cover used geometrical arguments to compute the probability of separating two s...
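For reference, the result this abstract alludes to is usually stated as Cover's function-counting theorem (the truncated text does not show which variant the paper uses, so the homogeneous-separator form below is an assumption): for N points in general position in d dimensions, the number of dichotomies realizable by a linear threshold function is

    C(N, d) = 2 \sum_{k=0}^{d-1} \binom{N-1}{k},

so the probability that a randomly chosen dichotomy of the N points is linearly separable is P(N, d) = C(N, d) / 2^N = 2^{1-N} \sum_{k=0}^{d-1} \binom{N-1}{k}.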
Abstract- We propose a novel learning algorithm to train networks with multi-layer linear-threshold ...
tput) is very commonly used to approximate unknown mappings. If the output layer is linear, such a n...
We present a general model for differentiable feed-forward neural networks. Its general mathematical...
We consider here how to separate multidimensional signals into two categories, such that the binary ...
This paper presents a novel approach for query-based neural network learning. Consider a layered per...
The localized linear discriminant network (LLDN) has been designed to address classification problem...
Abstract:- In this paper, we discuss the applications of the up-down algorithm. We focus on how to a...