Knowledge Distillation
Knowledge distillation typically requires access to the target model, in particular its output probabilities (soft labels), as well as access to the training data.
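To make these requirements concrete, the following is a minimal sketch of the standard soft-target distillation loss in PyTorch (in the style of Hinton et al., 2015). It is an illustrative example, not a specific implementation from this text: the function name `distillation_loss` and the parameter values (`temperature`, `alpha`) are hypothetical, and the code assumes both teacher logits (the output probabilities) and labeled training data are available, which is exactly the access this section describes.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Soft-target distillation loss.

    Blends a KL term between temperature-softened teacher and student
    distributions with ordinary cross-entropy on the hard labels.
    """
    # Soften both output distributions with the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between teacher and student; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    kd_loss = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2

    # Standard supervised loss on the ground-truth labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    return alpha * kd_loss + (1.0 - alpha) * ce_loss

# Hypothetical usage: random tensors stand in for a frozen teacher's
# outputs and a labeled training batch.
teacher_logits = torch.randn(8, 10)                        # teacher probabilities (as logits)
student_logits = torch.randn(8, 10, requires_grad=True)    # student outputs
labels = torch.randint(0, 10, (8,))                        # training-data labels
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Note how the loss consumes both the teacher's output distribution and the labeled training batch; if either is unavailable, this standard form of distillation cannot be applied directly.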