Hammer B. On the generalization ability of simple recurrent neural networks. Osnabrücker Schriften zur Mathematik. Osnabrück: Universität Osnabrück; 1997
Hammer B. Generalization of Elman Networks. In: Artificial Neural Networks - ICANN '97, 7th Interna...
A new approach to promote the generalization ability of neural networks is presented. It is based on...
Connectionist models of sentence processing must learn to behave systematically by generalizing from...
Hammer B. On the Generalization Ability of Recurrent Networks. In: Dorffner G, Bischof H, Hornik K, ...
Hammer B. On the Approximation Capability of Recurrent Neural Networks. In: Heiss M, ed. Proceedings...
Hammer B. On the approximation capability of recurrent neural networks. Neurocomputing. 2000;31(1-4)...
By making assumptions on the probability distribution of the potentials in a feed-forward neural net...
One of the most important features of natural as well as artificial ne...
In this essay some fundamental results from the theory of machine learning and neural networks are p...
This thesis presents a new theory of generalization in neural network types of learning machines. Th...
We study the learning and generalisation ability of a specific two-layer feed-forward neural network and...
We consider the generalization error of concept learning when using a fixed Boolean function...
de Raedt L, Hammer B, Hitzler P, Maass W, eds. Recurrent Neural Networks - Models, Capacities, and A...
Proceedings of: International Conference on Artificial Neural Networks - ICANN 2001. Vienna, Austria, Au...
Hammer B, Schrauwen B, Steil JJ. Recent advances in efficient learning of recurrent networks. In: Ve...