
DETRDistill: A Universal Knowledge Distillation Framework for DETR-families

Transformer-based detectors (DETRs) are becoming popular for their simple architecture, but their large model size and heavy computational cost hinder real-world deployment.

Data and Resources

Cite this as

Jiahao Chang, Shuo Wang, Hai-Ming Xu, Zehui Chen, Chenhongyi Yang, Feng Zhao (2024). Dataset: DETRDistill: A Universal Knowledge Distillation Framework for DETR-families. https://doi.org/10.57702/fdmip84v

DOI retrieved: December 2, 2024

Additional Info

Created: December 2, 2024
Last update: December 2, 2024
Authors: Jiahao Chang, Shuo Wang, Hai-Ming Xu, Zehui Chen, Chenhongyi Yang, Feng Zhao
Homepage https://arxiv.org/abs/2207.13085