The recursive branching network (RBN) was proposed in [1] to solve linearly non-separable problems using output-coded perceptrons. It relies on splitting the training patterns, at random, between parallel perceptrons. However, the random splitting mechanism can trap a perceptron in conflicting patterns. Optimized splitting methods are proposed here to ensure a meaningful split. We propose three splitting methods that use different similarity measures between patterns, and we evaluate them on five standard data sets. In general, these methods improve the performance of RBN and in many cases help reduce the network complexity.
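The abstract does not specify the three similarity measures, so as a minimal sketch only, one plausible similarity-based split (cosine similarity against group centroids, with seeds chosen as a maximally dissimilar pair) could look like the following; the function names and the seeding heuristic are illustrative assumptions, not the paper's algorithm:

```python
# Illustrative sketch (NOT the paper's exact method): assign each training
# pattern to the perceptron group whose current centroid is most similar to
# it (cosine similarity), instead of splitting at random.
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def centroid(group):
    # Component-wise mean of the patterns currently in a group.
    n = len(group)
    return [sum(p[i] for p in group) / n for i in range(len(group[0]))]

def split_by_similarity(patterns):
    # Hypothetical seeding: the first pattern and the pattern least
    # similar to it start the two groups; the rest are assigned greedily
    # to the group with the most similar centroid.
    seed0 = patterns[0]
    rest = patterns[1:]
    seed1 = min(rest, key=lambda p: cosine(p, seed0))
    groups = [[seed0], [seed1]]
    for p in rest:
        if p is seed1:
            continue
        best = max(range(2), key=lambda g: cosine(p, centroid(groups[g])))
        groups[best].append(p)
    return groups

groups = split_by_similarity([(1.0, 0.1), (0.9, 0.2), (0.1, 1.0), (0.2, 0.9)])
# Patterns pointing in similar directions end up in the same group.
```

Unlike a random split, this keeps mutually similar patterns together, which is the kind of "meaningful split" the abstract argues should reduce conflicting patterns within a perceptron.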
Neural networks with random weights appear in a variety of machine learning applications, most promi...
Abstract. Radial Basis Neural (RBN) network has the power of the universal approximation function an...
The authors address the problem of choosing synaptic weights in a recursive (Hopfield) neural networ...
Recursive branching network (RBN) was proposed in [1] to solve linearly non-separable problems using...
The Recursive Deterministic Perceptron is a generalisation of the single layer perceptron neural net...
The recursive deterministic perceptron (RDP) is a generalization of the single layer perceptron neur...
The Recursive Deterministic Perceptron (RDP) feed-forward multilayer neural network is a generalisat...
This paper introduces a comparison study of three existing methods for building Recursive Determinis...
Recursive neural networks are a new connectionist model recently introduced for processing graphs. L...
Abstract. The Recursive Deterministic Perceptron (RDP) feedforward multilayer neural network is a gene...
Recursive neural networks are conceived for processing graphs and extend the well-known recurrent mo...
Recursive neural networks are a powerful tool for processing structured data, thus filling the gap b...
feed-forward multilayer neural network is a generalisation of the single layer perceptron topology. ...
The Recursive Deterministic Perceptron (RDP) feed-forward multilayer neural network is a generalisat...
A relaxed group-wise splitting method (RGSM) is developed and evaluated for channel pruning of deep ...