- Exit-Ensemble Distillation
  This paper proposes a novel knowledge distillation-based learning method that improves the classification performance of convolutional neural networks (CNNs) without a pre-trained...
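As background for distillation-based entries like this one, here is a minimal NumPy sketch of the classical soft-target knowledge distillation objective (temperature-scaled KL divergence between teacher and student logits) that such methods build on. This is an illustrative sketch of the generic KD loss, not the paper's exit-ensemble method; all function names are hypothetical.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax, numerically stabilized."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target KD loss: KL(teacher || student) at temperature T,
    averaged over the batch and scaled by T^2 (standard gradient scaling)."""
    p_t = softmax(teacher_logits, T)
    log_p_t = np.log(p_t + 1e-12)
    log_p_s = np.log(softmax(student_logits, T) + 1e-12)
    kl = (p_t * (log_p_t - log_p_s)).sum(axis=-1)
    return float(kl.mean() * T * T)
```

The loss is zero when student and teacher logits agree and grows as their softened distributions diverge.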
- Distilling Object Detectors with Feature Richness
  Proposes a Feature-Richness Score (FRS) to select the important features that are most beneficial for distillation.
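A minimal NumPy sketch of the general idea of richness-weighted feature distillation: score each spatial location of the teacher's feature map (here, hypothetically, by its maximum class probability) and use the score to weight the student-teacher feature error. This is an assumed simplification for illustration, not the paper's exact FRS definition; all names and shapes are hypothetical.

```python
import numpy as np

def richness_mask(cls_probs):
    """Per-location richness score: max class probability at each
    spatial position. cls_probs has shape (num_classes, H, W)."""
    return cls_probs.max(axis=0)  # (H, W)

def frs_distill_loss(student_feat, teacher_feat, cls_probs, eps=1e-8):
    """Richness-weighted squared feature error.
    student_feat / teacher_feat: (C, H, W) feature maps."""
    mask = richness_mask(cls_probs)                 # (H, W)
    sq_err = (student_feat - teacher_feat) ** 2     # (C, H, W)
    weighted = mask * sq_err                        # broadcast over channels
    # Normalize by total mask weight times channel count.
    return float(weighted.sum() / (mask.sum() * sq_err.shape[0] + eps))
```

Locations where the teacher detects nothing contribute little to the loss, so the student imitates the teacher mainly where the features are informative.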
- DETRDistill: A Universal Knowledge Distillation Framework for DETR-families
  Transformer-based detectors (DETRs) are becoming popular for their simple framework, but their large model size and high inference cost hinder real-world deployment.