torchdistill
Model Optimization · infrastructure · open · #632 of 944 · +10 · Surging
Score: 66.4 · Low · High confidence
A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 26 knowledge distillation methods presented at TPAMI, CVPR, ICLR, ECCV, NeurIPS, ICCV, AAAI, etc.
Pillar Breakdown
Pillar       Weight  Score
Adoption     35%     61.5
Maintenance  30%     75.6
Friction     20%     98.9
Ecosystem    15%     37.8
Momentum: 1.00 (Surging) · 7d change: +0.39
Ranked #10 of 16 in Model Optimization