Tags → #efficiency
Model Compression via Knowledge Distillation
How knowledge distillation compresses large teacher models into compact student models by transferring the teacher's behavior through tailored training objectives.
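As an illustration of such a training objective, below is a minimal sketch of the classic soft-target distillation loss (in the style of Hinton et al.), assuming PyTorch. The `temperature` and `alpha` values are illustrative hyperparameters, not prescriptions from the text.

```python
# Minimal sketch of a knowledge-distillation objective, assuming PyTorch.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term with the usual hard-label cross-entropy."""
    # Soften both output distributions with the temperature before comparing.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # KL divergence from teacher to student; the T^2 factor keeps gradient
    # magnitudes roughly comparable across temperature settings.
    kd_term = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

The student is trained on this combined loss using the teacher's logits as soft targets; the temperature exposes the teacher's relative confidences across classes, which is the extra signal the compact student learns from.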