anonymous47823493 / EagleEye

(ECCV'2020 Oral) EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning

Knowledge Distillation #76

Open khushboo-anand0909 opened 1 year ago

khushboo-anand0909 commented 1 year ago

Hello, I can see there is a distiller folder that contains scripts for knowledge distillation. Could you please explain how to perform knowledge distillation with the obtained pruned model? Thanks!
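For reference while waiting for the maintainers, below is a minimal, generic sketch of Hinton-style knowledge distillation with the pruned sub-net as the student and the original unpruned network as the teacher. This is not the API of the repo's distiller scripts; all names (`teacher`, `student`, `train_loader`, `T`, `alpha`) are placeholder assumptions.

```python
# A minimal knowledge-distillation sketch (soft targets + hard labels),
# NOT the repo's distiller/ API. Names below are illustrative placeholders.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft-target term: KL divergence between temperature-softened distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def distill_one_epoch(teacher, student, train_loader, optimizer, device="cuda"):
    teacher.eval()    # the original (unpruned) model is frozen as the teacher
    student.train()   # the obtained pruned sub-net is fine-tuned as the student
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        with torch.no_grad():
            teacher_logits = teacher(images)
        student_logits = student(images)
        loss = distillation_loss(student_logits, teacher_logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```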