Knowledge Distillation Papers

Romero, A.; Ballas, N.; Kahou, S. E.; Chassang, A.; Gatta, C.; and Bengio, Y. 2015. FitNets: Hints for Thin Deep Nets. In International Conference on Learning Representations (ICLR).

A Comprehensive Overhaul of Feature Distillation. Feature-map positions where the teacher's response is below zero receive no strong supervision.
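
A minimal PyTorch sketch of that idea (the function name `partial_l2` and the exact masking rule are my paraphrase of the paper's partial L2 distance, not a verbatim reimplementation):

```python
import torch

def partial_l2(student_feat, teacher_feat):
    # Squared error everywhere, except where the teacher response is
    # non-positive and the student is already at or below it: those
    # positions receive no supervision signal.
    sq_err = (student_feat - teacher_feat) ** 2
    skip = (teacher_feat <= 0) & (student_feat <= teacher_feat)
    return sq_err.masked_fill(skip, 0.0).mean()
```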

A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning. Proposes the FSP method, which distills the relationships between feature maps of different layers.
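
A sketch of the FSP ("flow of solution procedure") matrix, assuming PyTorch feature maps with matching spatial size; the distillation loss is then an L2 distance between student and teacher FSP matrices:

```python
import torch

def fsp_matrix(f_a, f_b):
    # f_a: (B, C1, H, W) and f_b: (B, C2, H, W) come from two layers of
    # the same network. The FSP matrix is their channel-wise inner
    # product averaged over spatial positions -> (B, C1, C2).
    b, c1, h, w = f_a.shape
    c2 = f_b.shape[1]
    f_a = f_a.reshape(b, c1, h * w)
    f_b = f_b.reshape(b, c2, h * w)
    return torch.bmm(f_a, f_b.transpose(1, 2)) / (h * w)

def fsp_loss(student_pair, teacher_pair):
    # Match the student's "flow" between two layers to the teacher's.
    return ((fsp_matrix(*student_pair) - fsp_matrix(*teacher_pair)) ** 2).mean()
```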

Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons. The distillation targets are the activation boundaries (the separating hyperplanes) formed by each layer's hidden neurons.
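
A sketch of an activation-boundary loss on pre-activation tensors from matching layers; the margin-based form below is one common way to make the sign-matching objective differentiable (the name `ab_loss` and the margin value are my assumptions):

```python
import torch

def ab_loss(student_pre, teacher_pre, margin=1.0):
    # Push the student's pre-activation to the same side of zero as the
    # teacher's, with a margin, so both neurons fire on the same inputs,
    # i.e. the student reproduces the teacher's activation boundary.
    teacher_on = (teacher_pre > 0).float()
    loss = teacher_on * torch.clamp(margin - student_pre, min=0) ** 2 \
         + (1 - teacher_on) * torch.clamp(margin + student_pre, min=0) ** 2
    return loss.mean()
```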

Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer. Distills through attention maps.
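
A PyTorch sketch of activation-based attention transfer: collapse each feature map to a spatial attention map (squared activations summed over channels), L2-normalize it, and match student to teacher (function names are mine):

```python
import torch.nn.functional as F

def attention_map(feat):
    # (B, C, H, W) -> (B, H*W): squared activations summed over channels,
    # then L2-normalized per example.
    a = feat.pow(2).sum(dim=1).flatten(start_dim=1)
    return F.normalize(a, p=2, dim=1)

def at_loss(student_feat, teacher_feat):
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```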

Distilling the Knowledge in a Neural Network. The seminal work on knowledge distillation.
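
The paper's core soft-target loss as a PyTorch sketch (temperature `T` and mixing weight `alpha` are hyperparameters to tune):

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # KL divergence between temperature-softened distributions; the T^2
    # factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```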

Deep Mutual Learning. Peer networks learn from each other during training (mutual learning).
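
A two-network sketch of mutual learning in PyTorch: each peer's loss is its supervised cross-entropy plus a KL term pulling it toward the other peer's current predictions (the `detach` calls reflect the paper's alternating updates, where the other network's output is treated as a constant target):

```python
import torch.nn.functional as F

def dml_losses(logits_a, logits_b, labels):
    # Each network treats the other's (detached) predictions as a soft
    # target, so the two students teach each other while training.
    ce_a = F.cross_entropy(logits_a, labels)
    ce_b = F.cross_entropy(logits_b, labels)
    kl_a = F.kl_div(F.log_softmax(logits_a, dim=1),
                    F.softmax(logits_b.detach(), dim=1),
                    reduction="batchmean")
    kl_b = F.kl_div(F.log_softmax(logits_b, dim=1),
                    F.softmax(logits_a.detach(), dim=1),
                    reduction="batchmean")
    return ce_a + kl_a, ce_b + kl_b
```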

Born Again Neural Networks. Born-again networks: students share the teacher's architecture and are distilled over successive generations.

FitNets: Hints for Thin Deep Nets. The student fits not only the teacher's soft targets but also the outputs of the teacher's hidden layers (the intermediate features the teacher extracts).
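
A sketch of the FitNets hint loss, assuming PyTorch: a learned 1x1 convolution (the paper's regressor) maps the thinner student's guided-layer features to the teacher's hint-layer width before an L2 match (class and argument names are mine):

```python
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # Regressor bridging the width mismatch between the student's
        # guided layer and the teacher's hint layer (trained jointly).
        self.regressor = nn.Conv2d(student_channels, teacher_channels,
                                   kernel_size=1)

    def forward(self, student_hidden, teacher_hidden):
        # The teacher is frozen, so its features are detached targets.
        return F.mse_loss(self.regressor(student_hidden),
                          teacher_hidden.detach())
```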
