
Distilling Object Detectors with Global Knowledge

Knowledge distillation learns a lightweight student model that mimics a cumbersome teacher. Existing methods regard the knowledge as the features of individual instances or the relations between them; this is instance-level knowledge drawn only from the teacher model, i.e., local knowledge. However, empirical studies show that local knowledge is quite noisy in object detection tasks, especially for blurred, occluded, or small instances.
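For illustration, a minimal sketch of the two forms of local knowledge the abstract mentions: per-instance feature mimicking and pairwise relation matching. The function and tensor names are hypothetical, and this assumes a PyTorch-style setup with per-instance (e.g., RoI) features already extracted; it is not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def feature_distill_loss(student_feats: torch.Tensor,
                         teacher_feats: torch.Tensor) -> torch.Tensor:
    """Instance-level (local) distillation: the student mimics the
    teacher's per-instance features with an L2 penalty.

    Both inputs are (N, C) tensors of features for N instances.
    """
    # Detach the teacher so only the student receives gradients.
    return F.mse_loss(student_feats, teacher_feats.detach())

def relation_distill_loss(student_feats: torch.Tensor,
                          teacher_feats: torch.Tensor) -> torch.Tensor:
    """Relation-based local knowledge: match the pairwise instance
    similarity structure between student and teacher."""
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats.detach(), dim=1)
    # (N, N) cosine-similarity matrices encode instance relations.
    return F.mse_loss(s @ s.t(), t @ t.t())
```

Because both losses depend only on the teacher's per-instance outputs, noise in those instances (e.g., from blur, occlusion, or small object size) propagates directly into the distillation target, which is the limitation the abstract points out.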

Data and Resources

This dataset has no data

Cite this as

Sanli Tang, Zhongyu Zhang, Zhanzhan Cheng, Jing Lu, Yunlu Xu, Yi Niu, Fan He (2024). Dataset: Distilling Object Detectors with Global Knowledge. https://doi.org/10.57702/mmvmwrhs

Private DOI: This DOI is not yet resolvable. It is available for use in manuscripts and will be published when the dataset is made public.

Additional Info

Field         Value
Created       December 3, 2024
Last update   December 3, 2024
Author        Sanli Tang
More Authors  Zhongyu Zhang, Zhanzhan Cheng, Jing Lu, Yunlu Xu, Yi Niu, Fan He
Homepage      https://github.com/hikvision-research/DAVAR-Lab-ML