Recently, deep neural networks have proven their ability to achieve excellent results on tasks such as classification and dimensionality reduction. The issue of hyper-parameter selection is decisive for these networks, since the size of the search space increases exponentially with the number of layers. As a result, the grid-search approach is inappropriate and it is often left to the experimenter to "guess" sensible values for the hyper-parameters. In this study, we propose to select hyper-parameters layer after layer, on the basis of an unsupervised criterion, thus reducing the complexity of the hyper-parameter selection procedure to linear in the number of layers. Two unsupervised criteria are considered and the study focuses on determining an...
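A minimal sketch of the layer-after-layer idea described above, not the paper's implementation: the two unsupervised criteria studied there are not named in this snippet, so held-out reconstruction error of a linear (PCA) auto-encoder stands in for the criterion, the layer width stands in for the hyper-parameter being selected, and all function names are illustrative.

```python
# Hypothetical sketch: greedy, layer-wise hyper-parameter selection driven by an
# unsupervised criterion. The criterion (held-out reconstruction error of a linear
# auto-encoder) and the hyper-parameter (layer width) are stand-in assumptions.
import numpy as np


def fit_linear_autoencoder(X_train, width):
    """Fit a width-dimensional linear auto-encoder (PCA) on the training split."""
    mu = X_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    return mu, Vt[:width].T  # centering vector and encoder/decoder weights (d x width)


def reconstruction_error(X_val, mu, W):
    """Unsupervised criterion: mean squared reconstruction error on held-out data."""
    R = X_val - mu
    return float(np.mean((R - (R @ W) @ W.T) ** 2))


def greedy_layerwise_selection(X_train, X_val, n_layers, candidate_widths):
    """Choose each layer's width in turn, using only the unsupervised criterion.

    The search costs n_layers * len(candidate_widths) fits (linear in depth)
    instead of len(candidate_widths) ** n_layers for a joint grid search.
    """
    chosen = []
    for _ in range(n_layers):
        widths = [w for w in candidate_widths if w <= X_train.shape[1]]
        scored = [(reconstruction_error(X_val, *fit_linear_autoencoder(X_train, w)), w)
                  for w in widths]
        _, best_w = min(scored)
        chosen.append(best_w)
        # Freeze the selected layer and feed its codes to the next level.
        mu, W = fit_linear_autoencoder(X_train, best_w)
        X_train, X_val = (X_train - mu) @ W, (X_val - mu) @ W
    return chosen


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 32))
    print(greedy_layerwise_selection(X[:200], X[200:], n_layers=3,
                                     candidate_widths=[4, 8, 16]))
```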
The Neural Tangent Kernel (NTK) has emerged as a powerful tool to provide memorization, optimization...
One common problem in building deep learning architectures is the choice of the ...
We propose an optimal architecture for deep neural networks of given size. The optimal architecture ...
Deep Neural Networks (DNN) offer a new and efficient ML architecture based o...
Despite numerous successes in a wide range of industrial and scientific applications, the learning p...
Several recent advances to the state of the art in image classification benchmarks have come from be...
The structure of a neural network determines to a large extent its cost of training and use, as well...
Deep neural networks with millions of parameters are at the heart of many state-of-the-art computer ...
In the context of deep learning, the most expensive computational phase is the full training of the ...
In this paper, we propose a new automatic hyperparameter selection approach for determining the opti...
Convolutional neural networks (CNN) are a special type of multi-layer artificial neural network in w...
Multilayer neural networks were first proposed more than three decades ago, and various architecture...
Overfitting is an issue that deep learning faces in particular. It leads to highly accurate classif...