A universal binary neuron (UBN) operates with complex-valued weights and a complex-valued activation function that is a function of the argument of the weighted sum. This makes it possible to implement non-linearly-separable (non-threshold) Boolean functions on a single neuron. Hence the functionality of the UBN is incomparably higher than that of the traditional perceptron, since the UBN can implement non-threshold Boolean functions. The UBN is closely connected with the discrete multi-valued neuron (MVN), which is also a neuron with complex-valued weights and a complex-valued activation function that is a function of the argument of the weighted sum. A close relation of...
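To make the claim about non-threshold functions concrete, the sketch below is an illustration of ours, not code from the cited papers. It implements the sector-based activations with which the UBN and the discrete MVN are usually defined, assuming the standard alternating-sign form for the UBN; with the illustrative weights (0, 1, i) and four sectors, a single UBN then realizes XOR, the classic non-linearly-separable Boolean function that no real-weighted threshold neuron can compute.

```python
import cmath
import math

def ubn_activation(z: complex, m: int = 2) -> int:
    """UBN activation: split the complex plane into 2m equal sectors and
    output (-1)**j for the index j of the sector containing arg(z), so the
    output alternates between +1 and -1 from sector to sector."""
    angle = cmath.phase(z) % (2 * math.pi)   # arg(z) mapped into [0, 2*pi)
    j = int(angle // (math.pi / m))          # sector index, 0 .. 2m-1
    return (-1) ** j

def mvn_activation(z: complex, k: int) -> complex:
    """Discrete MVN activation (for comparison): split the plane into k equal
    sectors and output the k-th root of unity exp(i*2*pi*s/k) for sector s."""
    angle = cmath.phase(z) % (2 * math.pi)
    s = int(angle // (2 * math.pi / k))
    return cmath.exp(1j * 2 * math.pi * s / k)

def ubn(inputs, weights, m: int = 2) -> int:
    """A single universal binary neuron: complex weighted sum
    w0 + w1*x1 + ... + wn*xn followed by the alternating activation."""
    w0, *rest = weights
    z = w0 + sum(w * x for w, x in zip(rest, inputs))
    return ubn_activation(z, m)

# XOR on the {1, -1} alphabet (Boolean b encoded as (-1)**b) equals x1 * x2.
# With the illustrative weights (0, 1, i) and m = 2 (four 90-degree sectors)
# the single neuron reproduces it exactly.
weights = (0, 1, 1j)
for x1 in (1, -1):
    for x2 in (1, -1):
        assert ubn((x1, x2), weights) == x1 * x2
print("a single UBN with weights (0, 1, i) implements XOR")
```

With a larger number of sectors and suitably chosen weights the same mechanism extends to other non-threshold Boolean functions, which is the sense in which the abstract above calls the UBN's functionality higher than that of the traditional perceptron.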
We consider the sample complexity of concept learning when we classify by using a fixed Boolean func...
An algorithm for the training of a special multilayered feed-forward neural network is presented. Th...
We show that neural networks with three-times continuously differentiable activation functions are c...
A universal binary neuron (UBN) operates with complex-valued weights and a complex-valued activation...
In this paper, a new activation function for the multi-valued neuron (MVN) is presented. T...
Highly nonlinear data sets are important in the field of artificial neural networks. It i...
Starting with two hidden units, we train a simple single hidden layer feed-forward neural network to...
In this paper, ordered neural networks for the N-bit parity function containing [log2(N + 1)] threshol...
A new algorithm for learning representations in Boolean neural networks, where the inputs and output...
A basic neural model for Boolean computation is examined in the context of learning from examples. T...
A multilayer neural network based on multi-valued neurons is considered in the paper. A multivalued ...
This paper presents two models of complex-valued neurons (CVNs) for real-valued classification probl...
The multi-valued neuron (MVN) was proposed for pattern classification. It operates with complex-valued ...
In this paper, the ability of a Binary Neural Network comprising only neurons with zero thresholds a...
The paper will show that in order to obtain minimum size neural networks (i.e., size-optimal) for im...