The standard training of deep neural networks relies on a global, fixed loss function. For more effective training, dynamic loss functions have recently been proposed. However, a dynamic global loss function is not flexible enough to train the layers of complex deep neural networks differentially. In this paper, we propose a general framework that learns to adaptively train each layer of deep neural networks via meta-learning. Our framework leverages local error signals from the layers and identifies which layer needs to be trained more at every iteration. The proposed method also improves the local loss function with our minibatch-wise dropout and cross-validation loop to alleviate meta-overfitting. The experiments show that our method achieves...
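The abstract above describes two concrete ingredients: a local error signal per layer (so each layer can be evaluated without full backpropagation) and a gating step that decides which layer needs more training at each iteration. A minimal numpy sketch of that idea follows; the auxiliary head `A1`, the softmax gating, and all other names are illustrative assumptions, not the paper's actual meta-network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer network. Layer 1 gets a local linear "auxiliary head" (A1)
# so a local error signal can be computed without full backpropagation.
W1 = rng.normal(size=(4, 8)) * 0.1   # layer 1 weights
W2 = rng.normal(size=(8, 3)) * 0.1   # layer 2 (output) weights
A1 = rng.normal(size=(8, 3)) * 0.1   # hypothetical auxiliary head for layer 1

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def local_losses(x, y_onehot):
    """Per-layer cross-entropy losses: one local signal per layer."""
    h1 = np.tanh(x @ W1)
    # Layer 1's local loss, measured through its auxiliary head.
    l1 = -np.mean(np.sum(y_onehot * np.log(softmax(h1 @ A1) + 1e-9), axis=1))
    # Layer 2's local loss is just the network's output loss.
    l2 = -np.mean(np.sum(y_onehot * np.log(softmax(h1 @ W2) + 1e-9), axis=1))
    return np.array([l1, l2])

def layer_weights(losses, temperature=1.0):
    # Stand-in for the learned meta-network: layers with a larger local
    # loss receive a larger share of this iteration's update budget.
    e = np.exp(losses / temperature)
    return e / e.sum()

x = rng.normal(size=(16, 4))
y = np.eye(3)[rng.integers(0, 3, size=16)]
w = layer_weights(local_losses(x, y))
print(w)  # per-layer training weights for this minibatch, summing to 1
```

In the paper's framework this softmax gate would be replaced by a meta-learned model updated in an outer loop; the sketch only illustrates the interface between local losses and per-layer training emphasis.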
There is an increasing number of pre-trained deep neural network models. However, it is still unclea...
In the past decade, neural networks have demonstrated impressive performance in supervised learning....
Recent years have witnessed the success of deep neural networks in dealing with plenty of practica...
This paper proposes a meta-learning approach to evolving a parametrized loss function, which is call...
Error backpropagation is a highly effective mechanism for learning high-quality hierarchical feature...
Deep convolutional neural networks (CNNs) trained with logistic and softmax losses have made signifi...
Deep neural networks (DNNs) for social image classification are prone to performance reduction and o...
Supervised training of deep neural nets typically relies on minimizing cross-entropy. However, in ma...
Training deep neural networks is inherently subject to the predefined and fixed loss functions durin...
The undeniable computational power of artificial neural networks has granted the scientific communit...
Recent innovations in training deep convolutional neural network (ConvNet) models have motivated the...
The training of deep neural networks utilizes the backpropagation algorithm which consists of the fo...
Imbalanced class distribution is an inherent problem in many real-world classification tasks where t...
When applying transfer learning for medical image analysis, downstream tasks often have significant ...
A natural progression in machine learning research is to automate and learn from data increasingly m...