
Knowledge Distillation

Various papers related to distillation. From Iandola 2020: “While the term 'knowledge distillation' was coined by Hinton et al. 2015 to describe a specific method and equation, the term 'distillation' is now used in reference to a diverse range of approaches where a 'student' network is trained to replicate a 'teacher' network.”
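The specific method from Hinton et al. 2015 trains the student on temperature-softened teacher outputs alongside the usual hard labels. Below is a minimal pure-Python sketch of that loss; the function names, the temperature `T=4.0`, and the mixing weight `alpha=0.5` are illustrative choices, not values prescribed by the paper.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's relative confidence across wrong classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      T=4.0, alpha=0.5):
    """Hinton-style combined loss (sketch):
    alpha * cross-entropy(student, hard label at T=1)
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft).
    The T^2 factor keeps the soft-target gradient scale comparable
    to the hard-label term, as noted in the 2015 paper."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_teacher, p_student))
    hard_probs = softmax(student_logits)  # T = 1 for the hard-label term
    ce = -math.log(hard_probs[hard_label])
    return alpha * ce + (1 - alpha) * (T ** 2) * kl
```

For example, when the student's logits already match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains. Real implementations compute this over batches with autograd frameworks; this sketch only shows the per-example arithmetic.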

Overviews

Papers

ml/knowledge_distillation.txt · Last modified: 2025/05/12 08:11 by jmflanig
