Knowledge transfer / distillation
Category: Other
Updated: 30 Aug 2019
Created by: 龚成
Related timelines by 龚成
Quantization timeline, 29 Mar 2020
Pruning, 30 Aug 2019
decomposation, 30 Aug 2019
New timeline, 22 Aug 2019
Pruning Entropy, 12 Nov 2019
New timeline, 22 Jan 2020
Events
Geoffrey Hinton: Distilling the Knowledge in a Neural Network, arXiv, 2015
Cristian Buciluǎ: Model compression, SIGKDD, 2006
Jimmy Ba: Do Deep Nets Really Need to be Deep?, NIPS, 2014
Adriana Romero: FitNets: Hints for Thin Deep Nets, arXiv, 2015
Tianqi Chen: Net2Net: Accelerating Learning via Knowledge Transfer, arXiv, 2016
Zhizhong Li: Learning without Forgetting, TPAMI, 2017
Heng Wang: Segmenting Neuronal Structure in 3D Optical Microscope Images via Knowledge Distillation with Teacher-Student Network, ISBI, 2019
LightweightNet, PR, 2019
Rohan Anil: Large scale distributed neural network training through online distillation, 2018
Tommaso Furlanello: Born Again Neural Networks, 2018
Ilija Radosavovic: Data Distillation: Towards Omni-Supervised Learning, CVPR, 2018
Nicholas Frosst: Distilling a Neural Network Into a Soft Decision Tree, 2017
Junho Yim: A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning, CVPR, 2017
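The common idea running through these entries is training a small student network to match the softened outputs of a larger teacher. As a rough illustration, below is a minimal sketch of the temperature-scaled distillation loss described in Hinton et al. (2015), written against PyTorch; the function name and the default values of the temperature T and the weighting alpha are illustrative choices, not taken from any of the papers above.

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften teacher and student predictions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence between the teacher's soft targets and the student's
    # soft predictions; the T**2 factor keeps gradient magnitudes on a
    # comparable scale across temperatures, as noted in the 2015 paper.
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T ** 2)
    # Ordinary cross-entropy on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    # Weighted combination of the soft-target and hard-label terms.
    return alpha * kd + (1.0 - alpha) * ce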