
Self-distillation with Online Diffusion on Batch Manifolds Improves Deep Metric Learning

Deep metric learning (DML) methods typically rely solely on class labels to keep positive samples far away from negative ones. However, such methods normally ignore crucial knowledge hidden in the data (e.g., intra-class variation), which harms the generalization of the trained model. To alleviate this problem, this paper proposes Online Batch Diffusion-based Self-Distillation (OBD-SD) for DML.
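The abstract only names the technique, so as a loose illustration (not the authors' exact algorithm), "diffusion on batch manifolds" can be sketched as a manifold-ranking-style similarity propagation over a mini-batch's affinity graph; the function name `batch_diffusion` and the parameters `alpha` and `iters` below are hypothetical choices for this sketch:

```python
import numpy as np

def batch_diffusion(features, alpha=0.9, iters=20):
    """Hypothetical sketch: propagate pairwise similarities through the
    batch affinity graph via a standard manifold-ranking update."""
    # L2-normalize embeddings, then build non-negative cosine affinities.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    A = np.clip(f @ f.T, 0.0, None)
    np.fill_diagonal(A, 0.0)          # no self-loops
    # Symmetrically normalize: S = D^{-1/2} A D^{-1/2}.
    d = A.sum(axis=1)
    d[d == 0] = 1.0
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ A @ D_inv_sqrt
    # Iterate W <- alpha * S @ W + (1 - alpha) * I, starting from identity.
    I = np.eye(len(f))
    W = I.copy()
    for _ in range(iters):
        W = alpha * S @ W + (1 - alpha) * I
    return W                          # diffused batch similarities

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(8, 16))  # toy batch of 8 embeddings
W = batch_diffusion(embeddings)
```

In a self-distillation setting, such diffused similarities could serve as soft targets (e.g., row-softmaxed and matched to the model's own similarity distribution with a KL term); the precise losses and schedule used by OBD-SD are defined in the paper linked above.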

Data and Resources

Cite this as

Zelong Zeng, Hong Liu, Fan Yang, Shin'ichi Satoh (2024). Dataset: Self-distillation with Online Diffusion on Batch Manifolds Improves Deep Metric Learning. https://doi.org/10.57702/urle1zln

DOI retrieved: December 3, 2024

Additional Info

Field: Value
Created: December 3, 2024
Last update: December 3, 2024
Defined In: https://doi.org/10.48550/arXiv.2211.07566
Author: Zelong Zeng
More Authors: Hong Liu, Fan Yang, Shin'ichi Satoh
Homepage: https://github.com/ZelongZeng/OBD-SD_Pytorch