Recently, the teacher-student learning paradigm has drawn much attention for compressing neural networks onto low-end edge devices, such as mobile phones and wearable watches. Current algorithms mainly assume that the complete dataset used to train the teacher network is also available for training the student network. However, in real-world scenarios, users may only have access to part of the training examples due to commercial interests or data privacy, and severe over-fitting would result. In this paper, we tackle the challenge of learning student networks with few data by investigating the ground-truth data-generating distribution underlying these few data. Taking Wasserstein distance as the measurement, we assume this ideal data dis...
In this paper, we propose a simple but effective method for training neural networks with a limited ...
Although deep neural networks have enjoyed remarkable success across a wide variety of tasks, their ...
The tremendous recent growth in the fields of artificial intelligence and machine learning has large...
The advancement of deep learning technology has been concentrating on deploying end-to-end solutions...
Effective methods for learning deep neural networks with fewer parameters are urgently required, sin...
Deep Neural Networks give state-of-the-art results in all computer vision applications. This comes with ...
Model compression has been widely adopted to obtain light-weighted deep neural networks. Most preval...
We focus on the problem of training a deep neural network in generations. The flowchart is that, in ...
This paper investigates techniques to transfer information between deep neural networks. We demonstr...
© 2017 ACM. Training thin deep networks following the student-teacher learning paradigm has received...
In this work, we try to answer the question: given a network of observations which are not independe...
Deep neural networks achieve stellar generalisation on a variety of problems, despite often being la...
Knowledge distillation deals with the problem of training a smaller model (Student) from a high capa...
How to train an ideal teacher for knowledge distillation is still an open problem. It has been widel...
The remarkable successes of deep learning models across various applications have resulted in the...
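The abstracts above all revolve around the same teacher-student objective: matching a compact student's softened output distribution to a large teacher's. As a minimal sketch of that shared objective (not any single paper's method), the classic soft-target distillation loss with a temperature `T`, following Hinton et al.'s formulation, can be written with only the Python standard library:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a larger T yields a softer
    # distribution that exposes the teacher's "dark knowledge".
    m = max(z / T for z in logits)  # subtract max for numerical stability
    exps = [math.exp(z / T - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 so gradients keep a comparable magnitude
    # as T varies (the scaling used by Hinton et al.).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return T * T * kl
```

In practice this term is combined with the ordinary cross-entropy on the hard labels, weighted by a mixing coefficient; the temperature and weight are hyperparameters chosen per task.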