Abstract: Knowledge distillation has become a crucial technique for transferring intricate knowledge from a teacher model to a smaller student model. While logit-based knowledge distillation has shown ...
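The abstract names logit-based knowledge distillation as its starting point. Since the text is cut off before the paper's own method is described, the following is only a minimal sketch of the standard logit-based distillation baseline (soft targets via temperature-scaled KL divergence plus a hard-label cross-entropy term), assuming PyTorch; the function name and the `T`/`alpha` hyperparameters are illustrative, not from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard logit-based KD loss: soft-target KL term + hard-label CE term."""
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In this formulation the student imitates the teacher's full output distribution rather than only its top prediction, which is what lets a smaller model absorb the teacher's inter-class similarity structure.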