Distilling Object Detectors with Global Knowledge

Knowledge distillation learns a lightweight student model that mimics a cumbersome teacher. Existing methods regard the knowledge as the features of individual instances or the relations between them, which is instance-level knowledge derived only from the teacher model, i.e., local knowledge. However, empirical studies show that local knowledge is quite noisy in object detection tasks, especially for blurred, occluded, or small instances.
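As background, the instance-level (local) distillation the abstract refers to can be sketched as a feature-mimicking loss computed inside each instance's box region. The following PyTorch sketch is illustrative only; the function name, tensor shapes, and box format are assumptions, not the paper's implementation.

    import torch
    import torch.nn.functional as F

    def local_feature_distillation_loss(student_feat, teacher_feat, boxes):
        """Instance-level (local) distillation: the student mimics the
        frozen teacher's features inside each instance box.

        student_feat, teacher_feat: (C, H, W) feature maps (assumed shapes).
        boxes: list of (x1, y1, x2, y2) integer coords on the feature map.
        """
        loss = student_feat.new_zeros(())
        for x1, y1, x2, y2 in boxes:
            s = student_feat[:, y1:y2, x1:x2]
            t = teacher_feat[:, y1:y2, x1:x2].detach()  # no gradient to teacher
            loss = loss + F.mse_loss(s, t)
        return loss / max(len(boxes), 1)

Because the distillation targets come only from the teacher's response to each instance, noisy teacher features on blurred, occluded, or small instances propagate directly into the loss, which is the weakness of local knowledge that the associated paper addresses with global knowledge.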

Cite this as

Sanli Tang, Zhongyu Zhang, Zhanzhan Cheng, Jing Lu, Yunlu Xu, Yi Niu, Fan He (2024). Dataset: Distilling Object Detectors with Global Knowledge. https://doi.org/10.57702/mmvmwrhs

DOI retrieved: December 3, 2024

Additional Info

Field         Value
Created       December 3, 2024
Last update   December 3, 2024
Author        Sanli Tang
More Authors  Zhongyu Zhang, Zhanzhan Cheng, Jing Lu, Yunlu Xu, Yi Niu, Fan He
Homepage      https://github.com/hikvision-research/DAVAR-Lab-ML