Knowledge Distillation
Various papers related to distillation.
Overviews
- Section 4.2.2 of Iandola 2020
Papers
- Hinton et al 2015 - Distilling the Knowledge in a Neural Network (The paper that introduced knowledge distillation.)
- Kim & Rush 2016 - Sequence-Level Knowledge Distillation (First paper applying knowledge distillation to seq2seq models.)
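The core idea from Hinton et al. 2015 is to train the student against the teacher's temperature-softened output distribution. A minimal NumPy sketch of that soft-target loss follows; the function names and the example logits are illustrative, not from any of the papers above:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Cross-entropy between the teacher's and student's softened
    # distributions, scaled by T^2 as suggested in Hinton et al. 2015 so
    # soft-target gradients stay comparable in magnitude to hard-label loss.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -float(np.sum(p_teacher * np.log(p_student + 1e-12))) * T * T
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels via a weighting coefficient; the loss is minimized when the student's softened distribution matches the teacher's.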