Knowledge transfer / distillation
Category: Other
Updated: 30 Aug 2019
Created by: 龚成
Other timelines by 龚成:
Quantization timeline, 29 Mar 2020
Pruning, 30 Aug 2019
Decomposition, 30 Aug 2019
New timeline, 22 Aug 2019
Pruning Entropy, 12 Nov 2019
New timeline, 22 Jan 2020
Events
Geoffrey Hinton: Distilling the Knowledge in a Neural Network, arXiv, 2015
Cristian Buciluǎ: Model compression, SIGKDD, 2006
Jimmy Ba: Do Deep Nets Really Need to be Deep?, NIPS, 2014
Adriana Romero: FitNets: Hints for Thin Deep Nets, arXiv, 2015
Tianqi Chen: Net2Net: Accelerating Learning via Knowledge Transfer, arXiv, 2016
Zhizhong Li: Learning without Forgetting, TPAMI, 2017
Heng Wang: Segmenting Neuronal Structure in 3D Optical Microscope Images via Knowledge Distillation with Teacher-Student Network, ISBI, 2019
LightweightNet, PR, 2019
Rohan Anil: Large scale distributed neural network training through online distillation, 2018
Tommaso Furlanello: Born Again Neural Networks, 2018
Ilija Radosavovic: Data Distillation: Towards Omni-Supervised Learning, CVPR, 2018
Nicholas Frosst: Distilling a Neural Network Into a Soft Decision Tree, 2017
Junho Yim: A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning, CVPR, 2017
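The entries above trace knowledge distillation from Buciluǎ's model compression work to Hinton's soft-target formulation. Below is a minimal sketch of that soft-target loss, assuming PyTorch; the function name and the values for the temperature T and mixing weight alpha are illustrative choices, not taken from any entry on the timeline.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target distillation loss in the spirit of Hinton et al. (2015),
    mixed with the usual hard-label cross-entropy. T and alpha are illustrative."""
    # Soften student and teacher distributions with temperature T, then match
    # them with KL divergence (student gives log-probabilities, teacher probabilities).
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-target gradients match the hard-label term
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

Raising T softens the teacher's distribution, so the student also learns from the relative probabilities the teacher assigns to incorrect classes; the T*T factor keeps the soft-target gradients on the same scale as the hard-label term, as noted in the 2015 paper.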