knowledge transfer / distillation
Created by 龚成 · Updated 30 Aug 2019 · 363 views
Timelines by 龚成:
Quantization timeline (The quantization development history) · 29 Mar 2020 · 613 views
Pruning · 30 Aug 2019 · 463 views
decomposation · 30 Aug 2019 · 320 views
New timeline · 22 Aug 2019 · 272 views
Pruning Entropy · 12 Nov 2019 · 239 views
New timeline · 22 Jan 2020 · 175 views
Events
Geoffrey Hinton: Distilling the Knowledge in a Neural Network, arXiv, 2015 (the core loss is sketched after this list)
Cristian Buciluǎ: Model Compression, SIGKDD, 2006
Jimmy Ba: Do Deep Nets Really Need to be Deep?, NIPS, 2014
Adriana Romero: FitNets: Hints for Thin Deep Nets, arXiv, 2015
Tianqi Chen: Net2Net: Accelerating Learning via Knowledge Transfer, arXiv, 2016
Zhizhong Li: Learning without Forgetting, TPAMI, 2017
Heng Wang: Segmenting Neuronal Structure in 3D Optical Microscope Images via Knowledge Distillation with Teacher-Student Network, ISBI, 2019
LightweightNet, PR, 2019
Rohan Anil: Large scale distributed neural network training through online distillation, ICLR, 2018
Tommaso Furlanello: Born Again Neural Networks, ICML, 2018
Ilija Radosavovic: Data Distillation: Towards Omni-Supervised Learning, CVPR, 2018
Nicholas Frosst: Distilling a Neural Network Into a Soft Decision Tree, arXiv, 2017
Junho Yim: A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning, CVPR, 2017
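The thread running through these events is Hinton et al.'s (2015) soft-target formulation: the student is trained to match the teacher's temperature-softened output distribution alongside the usual hard labels. Below is a minimal sketch of that loss, assuming a PyTorch setup; the names `distillation_loss`, `student_logits`, `teacher_logits`, `T`, and `alpha` are illustrative assumptions, not taken from any of the cited papers' code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target distillation loss in the spirit of Hinton et al. (2015).

    Blends the KL divergence between temperature-softened teacher and
    student distributions with ordinary cross-entropy on the hard labels.
    All names and defaults here are illustrative, not a published API.
    """
    # Temperature-softened distributions; the T*T factor keeps the
    # soft-target gradients on the same scale as the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Illustrative usage with random tensors (batch of 8, 10 classes):
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```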