Knowledge distillation aims to compress deep models by transferring the knowledge learned by accurate but cumbersome teacher models to compact student models. Due to the extreme imbalance between foreground and background in images, directly applying traditional knowledge distillation methods to object detection leaves a large performance gap between the teacher model and the student model. We tackle this imbalance problem from a sampling perspective and propose to incorporate teacher-student prediction disagreements into a feature-based detection distillation framework. This is done in PDF-Distil by dynamically generating a weighting mask that is applied to the knowledge distillation loss...
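The abstract does not give implementation details, but the core idea of weighting a feature-imitation loss by teacher-student prediction disagreement can be sketched as follows. This is a minimal illustration, not PDF-Distil itself: the tensor shapes, the use of per-location KL divergence as the disagreement measure, and the per-image normalization are all assumptions made here for clarity.

```python
import torch
import torch.nn.functional as F

def disagreement_mask(teacher_logits, student_logits):
    """Per-location weight derived from teacher-student prediction disagreement.

    Both inputs are assumed to be dense classification logits of shape
    (N, C, H, W). Disagreement is measured here as the KL divergence between
    the teacher and student class distributions at each spatial location
    (an illustrative choice, not necessarily the paper's exact formulation).
    """
    t = F.log_softmax(teacher_logits, dim=1)
    s = F.log_softmax(student_logits, dim=1)
    # KL(teacher || student) per spatial location, shape (N, H, W)
    kl = (t.exp() * (t - s)).sum(dim=1)
    # Normalize so the weights sum to 1 over each image.
    w = kl / kl.flatten(1).sum(dim=1).clamp_min(1e-8).view(-1, 1, 1)
    # Detach so gradients do not flow through the mask itself.
    return w.detach()

def weighted_feature_kd_loss(teacher_feat, student_feat, mask):
    """Feature-imitation loss re-weighted by the disagreement mask.

    teacher_feat / student_feat: (N, C, H, W) backbone or FPN features,
    assumed already projected to the same channel dimension.
    mask: (N, H, W) weights produced by disagreement_mask.
    """
    per_location = (student_feat - teacher_feat).pow(2).mean(dim=1)  # (N, H, W)
    return (mask * per_location).sum(dim=(1, 2)).mean()
```

In training, such a distillation term would typically be added to the student detector's standard classification and regression losses with a balancing coefficient; regions where the student already agrees with the teacher then contribute little, while disagreement regions dominate the imitation signal.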