


Knowledge Distillation

Various papers related to distillation. From Iandola 2020: "While the term 'knowledge distillation' was coined by Hinton et al. to describe a specific method and equation [40], the term 'distillation' is now used in reference to a diverse range of approaches where a 'student' network is trained to replicate a 'teacher' network."
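The specific method from Hinton et al. combines a cross-entropy term on the ground-truth labels with a KL-divergence term that pushes the student's temperature-softened output distribution toward the teacher's. A minimal NumPy sketch of that loss is below; the function name, the temperature `T=4.0`, and the mixing weight `alpha=0.5` are illustrative choices, not values from any particular paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Softmax with optional temperature; higher T gives softer distributions."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Hinton-style knowledge distillation loss (sketch).

    alpha * KL(teacher_T || student_T) * T^2  +  (1 - alpha) * CE(labels, student)
    The T^2 factor keeps the soft-target gradients on the same scale as the
    hard-label gradients as the temperature changes.
    """
    eps = 1e-12  # guard against log(0)
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = np.log(softmax(student_logits, temperature) + eps)
    # KL divergence from the softened teacher distribution to the student's
    soft_loss = np.mean(
        np.sum(p_teacher * (np.log(p_teacher + eps) - log_p_student), axis=-1)
    ) * temperature ** 2
    # standard cross-entropy against the hard ground-truth labels (T = 1)
    log_p_hard = np.log(softmax(student_logits) + eps)
    hard_loss = -np.mean(log_p_hard[np.arange(len(labels)), labels])
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label cross-entropy remains, which is one quick sanity check on an implementation.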

Overviews

Papers

ml/knowledge_distillation.1659169746.txt.gz · Last modified: 2023/06/15 07:36 (external edit)
