Networks can be considered as approximation schemes. Multilayer networks of the backpropagation type can approximate continuous functions arbitrarily well (Cybenko, 1989; Funahashi, 1989; Stinchcombe and White, 1989). We prove that networks derived from regularization theory, including Radial Basis Functions (Poggio and Girosi, 1989), have a similar property. From the point of view of approximation theory, however, the ability to approximate continuous functions arbitrarily well is not sufficient to characterize a good approximation scheme; more critical is the property of best approximation. The main result of this paper is that multilayer networks of the type used in backpropagation do not have the best approximation property. For regularization...
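As a concrete illustration of the approximation schemes discussed above, the following is a minimal sketch (not taken from the paper) of a Gaussian radial-basis-function network f(x) = Σᵢ cᵢ exp(−(x − tᵢ)² / 2σ²), with fixed centers tᵢ and the coefficients cᵢ fitted by linear least squares to samples of a continuous target function. The center count, width σ, and target function are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_design(x, centers, sigma):
    """Design matrix G[j, i] = exp(-(x_j - t_i)^2 / (2 sigma^2))."""
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2.0 * sigma ** 2))

def fit_rbf(x, y, centers, sigma):
    """Fit coefficients c minimizing ||G c - y||^2 by linear least squares."""
    G = rbf_design(x, centers, sigma)
    coeffs, *_ = np.linalg.lstsq(G, y, rcond=None)
    return coeffs

def eval_rbf(x, centers, sigma, coeffs):
    """Evaluate the RBF expansion at the points x."""
    return rbf_design(x, centers, sigma) @ coeffs

# Approximate a continuous target on [0, 1] with 20 Gaussian basis functions.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x)
centers = np.linspace(0.0, 1.0, 20)
sigma = 0.1

c = fit_rbf(x, y, centers, sigma)
err = np.max(np.abs(eval_rbf(x, centers, sigma, c) - y))
```

Because the centers are fixed, the approximant is linear in the coefficients, so fitting reduces to a least-squares problem; the uniform error `err` shrinks as centers are added, in line with the density (universal approximation) property the abstract attributes to these networks.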