Knowledge transfer / distillation
Category: Other
Last updated: August 30, 2019
Created by: 龚成
Other timelines by 龚成:
Quantization timeline (March 29, 2020)
Pruning (August 30, 2019)
decomposation (August 30, 2019)
New timeline (August 22, 2019)
Pruning Entropy (November 12, 2019)
New timeline (January 22, 2020)
Events
Geoffrey Hinton: Distilling the Knowledge in a Neural Network, arxiv, 2015 (see the loss sketch after this list)
Cristian Buciluǎ: Model compression, SIGKDD, 2006
Jimmy Ba: Do Deep Nets Really Need to Be Deep?, NIPS, 2014
Adriana Romero: FitNets: Hints for Thin Deep Nets, arxiv, 2015
Tianqi Chen: Net2Net: Accelerating Learning via Knowledge Transfer, arxiv, 2016
Zhizhong Li: Learning without Forgetting, TPAMI, 2017
Heng Wang: Segmenting Neuronal Structure in 3D Optical Microscope Images via Knowledge Distillation with Teacher-Student Network, ISBI, 2019
LightweightNet, PR, 2019
Rohan Anil: Large scale distributed neural network training through online distillation, 2018
Tommaso Furlanello: Born Again Neural Networks, 2018
Ilija Radosavovic: Data Distillation: Towards Omni-Supervised Learning, CVPR, 2018
Nicholas Frosst: Distilling a Neural Network Into a Soft Decision Tree, 2017
Junho Yim: A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning, CVPR, 2017
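For context on the Geoffrey Hinton (2015) entry above, the core distillation recipe trains a small student on a weighted mix of the teacher's temperature-softened outputs and the ordinary hard labels. Below is a minimal PyTorch-style sketch of that combined loss; the temperature T, weight alpha, and the name distillation_loss are illustrative choices, not values or APIs taken from any of the listed papers.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between the temperature-softened student
    # and teacher distributions, scaled by T^2 as in Hinton et al. (2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # T and alpha are illustrative hyperparameters, not prescribed by the papers above.
    return alpha * soft + (1.0 - alpha) * hard

In a typical training loop, the teacher's logits would be computed under torch.no_grad() and only the student's parameters updated with this loss.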