We study on-line learning of a linearly separable rule with a simple perceptron. Training utilizes a sequence of uncorrelated, randomly drawn N-dimensional input examples. In the thermodynamic limit the generalization error after training with P such examples can be calculated exactly. For the standard perceptron algorithm it decreases like (N/P)^1/3 for large P/N, in contrast to the faster (N/P)^1/2 behaviour of so-called Hebbian learning. Furthermore, we show that a specific parameter-free on-line scheme, the AdaTron algorithm, gives an asymptotic (N/P)-decay of the generalization error. This coincides (up to a constant factor) with the bound for any training process based on random examples, including off-line learning. Simulations co...
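The three on-line rules named above (Hebbian, perceptron, AdaTron) differ only in when an example triggers an update and in the size of the step. The following is a minimal teacher-student sketch of these rules, not the paper's calculation; the Gaussian input distribution, the 1/N step scaling and the small random initial student are assumptions made for illustration. The generalization error is read off from the teacher-student overlap via eps = arccos(R)/pi, the standard expression for isotropic inputs.

```python
# Minimal sketch of the three on-line rules (assumed conventions, see above).
import numpy as np

rng = np.random.default_rng(0)
N = 500                                     # input dimension
P = 20 * N                                  # number of on-line examples (alpha = P/N = 20)
B = rng.standard_normal(N)                  # teacher weight vector

def gen_error(J):
    """Generalization error for isotropic inputs: eps = arccos(R)/pi."""
    R = J @ B / (np.linalg.norm(J) * np.linalg.norm(B) + 1e-12)
    return np.arccos(np.clip(R, -1.0, 1.0)) / np.pi

def train(rule):
    J = rng.standard_normal(N) / np.sqrt(N)     # small random start (assumption)
    for _ in range(P):
        xi = rng.standard_normal(N)
        sigma_T = 1.0 if B @ xi >= 0 else -1.0  # teacher label
        h = J @ xi                              # student local field
        mistake = h * sigma_T <= 0
        if rule == "hebbian":
            J += sigma_T * xi / N               # update on every example
        elif rule == "perceptron" and mistake:
            J += sigma_T * xi / N               # fixed step, only on mistakes
        elif rule == "adatron" and mistake:
            J += abs(h) * sigma_T * xi / N      # mistake-driven, step ~ |h|
    return gen_error(J)

for rule in ("hebbian", "perceptron", "adatron"):
    print(f"{rule:10s} eps_g ~ {train(rule):.3f}")
```

Running this at several values of P/N and plotting eps_g against P/N is enough to see the qualitative separation of the three decay laws quoted above.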
We analyze the generalization ability of a simple perceptron acting on a structured input distributi...
We investigate the clipped Hebb rule for learning different multilayer networks of nonoverlapping pe...
We study the learning of a time-dependent linearly separable rule in a neural network. The rule is r...
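The abstract above is truncated before the rule is specified; a common setting for a time-dependent linearly separable rule is a teacher vector that slowly drifts (for example, performs a random walk) while the student learns on-line. The sketch below illustrates that scenario; the drift model, its strength and the perceptron-style student update are assumptions, not the paper's model.

```python
# Hypothetical drifting-teacher scenario: the teacher performs a slow random
# walk on the sphere of radius sqrt(N) while a perceptron student tracks it
# with mistake-driven on-line updates. All constants are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N, steps, drift = 500, 50_000, 0.02         # dimension, examples, drift strength
B = rng.standard_normal(N)                  # teacher weight vector
J = np.zeros(N)                             # student weight vector

def gen_error(J, B):
    R = J @ B / (np.linalg.norm(J) * np.linalg.norm(B) + 1e-12)
    return np.arccos(np.clip(R, -1.0, 1.0)) / np.pi

for t in range(steps):
    # teacher drift: small random kick, norm restored to sqrt(N)
    B += drift * rng.standard_normal(N) / np.sqrt(N)
    B *= np.sqrt(N) / np.linalg.norm(B)
    # one on-line example, perceptron update on mistakes only
    xi = rng.standard_normal(N)
    sigma_T = 1.0 if B @ xi >= 0 else -1.0
    if (J @ xi) * sigma_T <= 0:
        J += sigma_T * xi / np.sqrt(N)
        J *= np.sqrt(N) / np.linalg.norm(J)  # keep |J| fixed: constant effective rate

print("residual generalization error ~", round(gen_error(J, B), 3))
```

Because the teacher keeps moving, the generalization error does not decay to zero but settles at a finite residual value set by the balance between drift strength and update size.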
On-line learning of a rule given by an N-dimensional Ising perceptron is considered for the case wh...
The performance of on-line algorithms for learning dichotomies is studied. In on-line learning, the ...
We analyse on-line learning of a linearly separable rule with a simple perceptron. Example inputs ar...
We study on-line gradient-descent learning in multilayer networks analytically and numerically. The ...
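The truncated abstract above concerns on-line gradient descent in multilayer networks; a model frequently analysed in this setting is the two-layer "soft committee machine" (erf hidden units, fixed unit output weights) trained on a matching teacher. The sketch below is one such simulation; the network sizes, learning rate and field scalings are assumptions chosen for illustration, not the paper's specification.

```python
# Hypothetical on-line gradient descent in a two-layer soft committee machine
# (erf hidden units, fixed unit output weights) learning a matching teacher.
# Sizes, learning rate and scalings are illustrative assumptions.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(2)
N, K, M, eta = 200, 2, 2, 0.5            # input dim, student/teacher hidden units, learning rate
B = rng.standard_normal((M, N))          # teacher hidden weights
J = 0.1 * rng.standard_normal((K, N))    # student hidden weights, small random init

def forward(W, xi):
    u = W @ xi / np.sqrt(N)              # hidden fields, O(1) by construction
    return np.sum(erf(u / np.sqrt(2))), u

for step in range(300 * N):              # 300*N examples, i.e. alpha = 300
    xi = rng.standard_normal(N)
    y, _ = forward(B, xi)                # teacher output
    s, u = forward(J, xi)                # student output and hidden fields
    err = s - y
    # gradient of the per-example loss 0.5*err**2 w.r.t. each hidden weight vector
    g_prime = np.sqrt(2.0 / np.pi) * np.exp(-0.5 * u**2)
    J -= (eta / np.sqrt(N)) * err * g_prime[:, None] * xi[None, :]

# crude Monte Carlo estimate of the generalization error 0.5*E[(s - y)^2]
samples = [0.5 * (forward(J, x)[0] - forward(B, x)[0]) ** 2
           for x in rng.standard_normal((2000, N))]
print("estimated eps_g ~", round(float(np.mean(samples)), 4))
```

Training curves from runs like this typically show an initial drop, a long plateau while the student hidden units remain symmetric, and a final decay once they specialize to individual teacher units.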
We solve the dynamics of on-line Hebbian learning in perceptrons exactly, for the regime where the s...
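The abstract above is cut off before it states the regime in which the dynamics are solved. For orientation, the simplest baseline, noise-free on-line Hebbian learning from a zero initial student with isotropic inputs, can be worked out in a few lines; the conventions below are assumptions for this worked example and do not reproduce the paper's specific setting.

```latex
% Baseline (assumed setting): teacher B with |B|^2 = N, labels
% \sigma^\mu = \mathrm{sgn}(B\cdot\xi^\mu/\sqrt{N}), isotropic inputs,
% Hebbian student J = \sum_\mu \sigma^\mu \xi^\mu after P = \alpha N examples.
\begin{align}
  \langle J\cdot B\rangle &= P\sqrt{N}\,\sqrt{2/\pi}, &
  \langle |J|^2\rangle &\simeq PN + P^2\,\tfrac{2}{\pi}, \\
  R &= \frac{J\cdot B}{|J|\,|B|} \simeq \frac{1}{\sqrt{1+\pi/(2\alpha)}}, &
  \epsilon_g &= \frac{\arccos R}{\pi}
    \;\xrightarrow{\;\alpha\gg 1\;}\; \frac{1}{\sqrt{2\pi\alpha}} \propto \sqrt{N/P}.
\end{align}
```

The square-root decay recovered here is the (N/P)^1/2 Hebbian behaviour quoted in the first abstract of this list; the paper above treats a richer regime than this baseline.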
We analyse online (gradient descent) learning of a rule from a finite set of training examples ...
Learning algorithms for perceptrons are deduced from statistical mechanics. Thermodynamical quantiti...
We study learning from a single presentation of examples (incremental or on-line learning) in single-...
In this paper we show how to extract a hypothesis with small risk from the ensemble of hypotheses ge...