ConceptDistil: Model-Agnostic Distillation of Concept Explanations
Teaching the machine to explain itself using domain knowledge
ConceptDistil is a method to bring concept explanations to any black-box classifier using knowledge distillation. Two architecture variants are presented: a 2-staged variant and a no-gradient variant.
Distilling Object Detectors with Global Knowledge
Knowledge distillation learns a lightweight student model that mimics a cumbersome teacher. Existing methods regard the knowledge as the feature of each instance or their...
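The student-mimics-teacher idea mentioned above is usually implemented as a loss on temperature-softened output distributions. Below is a minimal NumPy sketch of that soft-label distillation loss; the temperature `T` and the T² scaling follow Hinton et al.'s standard formulation and are an illustrative assumption here, not details taken from any of the papers listed.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between the softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# When the student matches the teacher exactly, the loss is zero.
print(distillation_loss([1.0, 2.0], [1.0, 2.0]))  # → 0.0
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient.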
DINOv2: Learning robust visual features without supervision
The authors propose a method for self-supervised representation learning using knowledge distillation and vision transformers.