Neural network modeling for small datasets can be justified from a theoretical point of view by results of Bartlett showing that the generalization performance of a multilayer perceptron (MLP) depends more on the L1 norm ||c||_1 of the weights between the hidden layer and the output layer than on the total number of weights. In this article we investigate some geometrical properties of MLPs and, drawing on linear projection theory, propose an equivalent number of degrees of freedom to be used in neural model selection criteria such as the Akaike information criterion and the Bayes information criterion, and in the unbiased estimation of the error variance. This measure proves to be much smaller th...
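The way such an equivalent number of degrees of freedom would enter AIC, BIC, and the error-variance estimate can be sketched as follows. This is a minimal illustration assuming a Gaussian error model; the function name and variable names are hypothetical and not taken from the article, which defines the e.d.f. itself differently from a raw parameter count:

```python
import numpy as np

def selection_criteria(residuals, edf):
    """AIC, BIC, and an unbiased-style error-variance estimate for a
    fitted regression model, with the usual parameter count replaced
    by an equivalent number of degrees of freedom (edf)."""
    r = np.asarray(residuals, dtype=float)
    n = r.size
    sse = float(np.sum(r ** 2))
    # Gaussian log-likelihood term, up to additive constants.
    fit_term = n * np.log(sse / n)
    aic = fit_term + 2.0 * edf          # AIC penalty: 2 per degree of freedom
    bic = fit_term + np.log(n) * edf    # BIC penalty: log(n) per degree of freedom
    sigma2_hat = sse / (n - edf)        # error variance with edf in the denominator
    return aic, bic, sigma2_hat
```

The point of the abstract is that plugging the (much smaller) e.d.f. into `edf`, rather than the total number of MLP weights, yields far milder complexity penalties and a usable variance estimate even when the weight count exceeds the sample size.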
For general Bayes decision rules, we consider perceptron approximations based on suf...
This dissertation is designed to answer the following questions: (1) Which measurement model is bett...
Most application work within neural computing continues to employ multi-layer perceptrons (MLP). Tho...
Using richly parameterised models for small datasets can be justified from a theoretical point of vie...
This thesis concerns the Multi-layer Perceptron (MLP) model, one of a variety of neural network mode...
The notion of equivalent number of degrees of freedom (e.d.f.) has been recently proposed in the con...
The focus of this paper is on the neural network modelling approach that has gained increasing recog...
Feedforward neural networks trained by error backpropagation are examples of nonparametric regressio...
Neural networks can be regarded as statistical models, and can be analysed in a Bayesian framework. ...
It took until the last decade to finally see a machine match human performance on essentially any ta...
Finding useful representations of data in order to facilitate scientific knowledge generation is a u...
In this contribution we present results of using possibly inaccurate knowledge of model derivative...
This dissertation considers the subject of information losses arising from finite datasets used in t...