Knowledge transfer / distillation
Category: Other
Updated: Aug 30, 2019
Created by 龚成
Other timelines by 龚成:
Quantization timeline (Mar 29, 2020)
Pruning (Aug 30, 2019)
Decomposition (Aug 30, 2019)
New timeline (Aug 22, 2019)
Pruning Entropy (Nov 12, 2019)
New timeline (Jan 22, 2020)
Events
Geoffrey Hinton: Distilling the Knowledge in a Neural Network, arxiv, 2015
Cristian Buciluǎ: Model compression, SIGKDD, 2006
Jimmy Ba: Do Deep Nets Really Need to be Deep?, NIPS, 2014
Adriana Romero: FitNets: Hints for Thin Deep Nets, arxiv, 2015
Tianqi Chen: Net2Net: Accelerating Learning via Knowledge Transfer, arxiv, 2016
Zhizhong Li: Learning without Forgetting, TPAMI, 2017
Heng Wang: Segmenting Neuronal Structure in 3D Optical Microscope Images via Knowledge Distillation with Teacher-Student Network, ISBI, 2019
LightweightNet, PR, 2019
Rohan Anil: Large scale distributed neural network training through online distillation, 2018
Tommaso Furlanello: Born Again Neural Networks, 2018
Ilija Radosavovic: Data Distillation: Towards Omni-Supervised Learning, CVPR, 2018
Nicholas Frosst: Distilling a Neural Network Into a Soft Decision Tree, 2017
Junho Yim: A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning, CVPR, 2017
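Most of the entries above build on the approach of Hinton et al. (2015): a compact student network is trained to match the temperature-softened output distribution of a larger teacher, alongside the usual hard labels. The sketch below illustrates that loss in PyTorch; the `teacher` and `student` models, the temperature `T`, and the mixing weight `alpha` are placeholders chosen for illustration, not values taken from any of the listed papers.

```python
# Minimal sketch of Hinton-style knowledge distillation (soft targets with temperature).
# `teacher` and `student` are toy placeholder models; any pair of classifiers with
# matching output dimensions would work the same way.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Combine the soft-target loss (teacher guidance) with hard-label cross-entropy."""
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example usage with toy models and random data (shapes only, for illustration).
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
x = torch.randn(8, 32)
y = torch.randint(0, 10, (8,))

with torch.no_grad():          # the teacher is frozen during distillation
    t_logits = teacher(x)
s_logits = student(x)
loss = distillation_loss(s_logits, t_logits, y)
loss.backward()                # gradients flow only into the student
```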