We investigate a novel neural network model that uses stochastic weights. It is shown that the functionality of the network is comparable to that of a general stochastic neural network using standard sigmoid activation functions. For the multilayer feedforward structure, we demonstrate that the network can successfully solve a real-world problem such as handwritten digit recognition. It is also shown that the recurrent network is as powerful as a Boltzmann machine. A new technique for implementing simulated annealing is presented. Simulation results on the graph bisection problem demonstrate that the model is efficient for global optimization.
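The abstract cites simulated annealing results on graph bisection but does not spell out the procedure here, so the following is only a minimal sketch of a generic simulated-annealing baseline for balanced graph bisection, not the stochastic-weight network's own technique; the function names, swap move, and geometric cooling schedule are assumptions made for illustration.

```python
import random
import math

def bisection_cost(edges, side):
    """Number of edges crossing the partition (cut size)."""
    return sum(1 for u, v in edges if side[u] != side[v])

def anneal_bisection(n_nodes, edges, t0=2.0, cooling=0.995, steps=20000, seed=0):
    """Plain simulated annealing for balanced graph bisection.

    Each move swaps one node from each side, so the two halves stay
    (nearly) equal in size. This is a generic baseline sketch, not the
    stochastic-weight network method described in the abstract.
    """
    rng = random.Random(seed)
    nodes = list(range(n_nodes))
    rng.shuffle(nodes)
    half = n_nodes // 2
    side = {u: (0 if i < half else 1) for i, u in enumerate(nodes)}
    cost = bisection_cost(edges, side)
    best_side, best_cost = dict(side), cost
    t = t0
    for _ in range(steps):
        u = rng.choice([v for v in side if side[v] == 0])
        w = rng.choice([v for v in side if side[v] == 1])
        side[u], side[w] = 1, 0                  # propose a balanced swap
        new_cost = bisection_cost(edges, side)
        delta = new_cost - cost
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cost = new_cost                      # accept the swap
            if cost < best_cost:
                best_cost, best_side = cost, dict(side)
        else:
            side[u], side[w] = 0, 1              # reject: undo the swap
        t *= cooling                             # geometric cooling schedule
    return best_side, best_cost

# Example: bisect an 8-node ring graph with one chord.
edges = [(i, (i + 1) % 8) for i in range(8)] + [(0, 4)]
partition, cut = anneal_bisection(8, edges)
print(cut, partition)
```

The Metropolis acceptance rule accepts any swap that lowers the cut and accepts cost-increasing swaps with probability exp(-delta / t), which is what allows the search to escape local minima early on while the temperature is still high.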
Complex diagnosis problems, defined by high-level models, often lead to constraint-based discrete op...
We present an approach to investigate the dependence of the capabilities of neural networks on their...
Most Artificial Neural Networks that are widely used today focus on approximating deterministic inpu...
Stochastic neural networks, which are a type of recurrent neural network, can be basically and simply ...
Artificial neural networks are brain-like models of parallel computations and cognitive phenomena. W...
The first purpose of this paper is to present a class of algorithms for finding the global minimum o...
This paper addresses the problem of neural computing by a fundamentally different approach to the on...
Many connectionist learning algorithms consist of minimizing a cost of the form C(w) = E(J(z; w)) ...
The work reported here began with the desire to find a network architecture that shared...
This article describes a novel neural stochastic model for solving graph problems. The neural system...
Artificial Neural Networks (ANNs) must be able to learn by experience from the environment. This propert...
Artificial Neural Networks (ANNs) can be viewed as a mathematical model to simulate natural and biol...
The paper studies a stochastic extension of continuous recurrent neural networks and analyzes gradie...
A fundamental difficulty when applying neural networks to problems of pattern recognition is t...