Black-box Few-shot Knowledge Distillation

Knowledge distillation with few samples and a black-box teacher. The method uses MixUp and a Conditional Variational Autoencoder (CVAE) to generate synthetic images for training the student network.
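The sketch below is a minimal illustration of the MixUp step only, not the authors' FS-BBT implementation: it shows how a small pool of real images could be expanded into synthetic samples by convex combination, with all function names, shapes, and parameters being assumptions chosen for this example.

```python
# Illustrative sketch (assumed names/shapes), not the FS-BBT code:
# expand a few real samples into synthetic training images via MixUp.
import numpy as np

def mixup_images(images, n_synthetic, alpha=1.0, rng=None):
    """Create synthetic images by convexly combining random pairs of real images.

    images: array of shape (N, H, W, C) holding the few available real samples.
    n_synthetic: number of MixUp images to generate.
    alpha: Beta-distribution parameter controlling the mixing ratio.
    """
    rng = rng or np.random.default_rng()
    n = len(images)
    synthetic = []
    for _ in range(n_synthetic):
        i, j = rng.integers(0, n, size=2)    # pick a random pair of real images
        lam = rng.beta(alpha, alpha)         # mixing coefficient in [0, 1]
        synthetic.append(lam * images[i] + (1 - lam) * images[j])
    return np.stack(synthetic)

# Example: expand 10 real 32x32 RGB samples into 1000 synthetic ones.
few_real = np.random.rand(10, 32, 32, 3).astype(np.float32)
fake_batch = mixup_images(few_real, n_synthetic=1000)
# In black-box distillation, such images would be sent to the teacher to obtain
# labels, which then supervise the student network.
print(fake_batch.shape)  # (1000, 32, 32, 3)
```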

Data and Resources

Cite this as

Dang Nguyen, Sunil Gupta, Kien Do, Svetha Venkatesh (2024). Dataset: Black-box Few-shot Knowledge Distillation. https://doi.org/10.57702/2mvz6so7

DOI retrieved: December 2, 2024

Additional Info

Field         Value
Created       December 2, 2024
Last update   December 2, 2024
Defined In    https://doi.org/10.48550/arXiv.2207.12106
Author        Dang Nguyen
More Authors  Sunil Gupta, Kien Do, Svetha Venkatesh
Homepage      https://github.com/nphdang/FS-BBT