Deep neural networks have had tremendous success in a wide range of applications, where they achieve state-of-the-art performance. Their success can generally be attributed to three main pillars: their natural back-propagation structure, which allows time- and resource-efficient gradient computation; recent advances in optimization theory, which have led to the development of fast training algorithms; and the availability of computationally efficient software (neural network frameworks such as PyTorch and TensorFlow) and hardware (Graphics Processing Units (GPUs) and, more recently, Tensor Processing Units (TPUs)). Deep neural networks are now the model of choice for many practitioners. As a result, there is a growing research interest in their theore...
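As a minimal sketch of the first pillar, the snippet below illustrates how reverse-mode back-propagation computes gradients of a scalar loss with respect to every parameter in a single backward pass; it uses PyTorch (named in the abstract above), while the network shape, batch size, and loss are illustrative assumptions, not taken from the source.

```python
# Illustrative sketch: back-propagation via PyTorch autograd.
# All sizes below are assumptions chosen for the example.
import torch
import torch.nn as nn

# A small fully connected network (layer widths are arbitrary here).
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))

x = torch.randn(8, 10)   # a batch of 8 random inputs
y = torch.randn(8, 1)    # random targets

loss = nn.functional.mse_loss(model(x), y)
loss.backward()          # one backward pass fills .grad for every parameter

for name, p in model.named_parameters():
    # Gradients for all weights and biases are now available,
    # at a cost of the same order as the forward pass.
    print(name, p.grad.shape)
```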
Optimization is the key component of deep learning. Increasing depth, which is vital for reaching a...
Kernel methods and neural networks are two important schemes in the supervised learning field. The t...
Understanding how feature learning affects generalization is among the foremost goals of modern deep...
In the past decade, deep neural networks have solved ever more complex tasks across many fronts in...
It took until the last decade to finally see a machine match human performance on essentially any ta...
The weight initialization and the activation function of deep neural networks have a crucial impact ...
Given any deep fully connected neural network, initialized with random Gaussian parameters, we bound...
This thesis characterizes the training process of deep neural networks. We are driven by two apparen...
The activation function deployed in a deep neural network has great influence on the performance of ...
This paper considers several aspects of random matrix universality in deep neural networks. Motivate...
Randomized Neural Networks explore the behavior of neural systems where the majority of connections ...
In recent years, Deep Neural Networks (DNNs) have managed to succeed at tasks that previously ap...
Learning with neural networks depends on the particular parametrization of the functions represented...
We introduce a probability distribution, combined with an efficient sampling algorithm, for weights ...
Context of the tutorial: the IEEE CIS Summer School on Computational Intelligence and Applications (...