SKKU-ESLAB / Auto-Compression

Automatic DNN compression tool with various model compression and neural architecture search techniques
MIT License

Add joint training and knowledge distillation #26

Closed · sunghern closed this 2 years ago

sunghern commented 2 years ago

Training with a single pruning rate cannot cover multiple pruning rates, so it is not suitable for dynamic offloading.

We need to train the model jointly across multiple pruning rates and apply knowledge distillation, as in the sketch below.
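A minimal sketch of one training step that does both, in the spirit of in-place distillation from slimmable networks: the full (unpruned) network is trained on the hard labels and serves as the teacher, while each pruned sub-network is distilled from the teacher's soft targets in the same step. `set_pruning_rate` and the `pruning_rate` attribute are hypothetical stand-ins for whatever mechanism Auto-Compression uses to switch sub-networks; the rate list and temperature are illustrative, not values from this repo.

```python
import torch
import torch.nn.functional as F

PRUNING_RATES = [0.0, 0.25, 0.5, 0.75]  # 0.0 = full network (teacher)

def set_pruning_rate(model, rate):
    """Hypothetical hook: reconfigure the model to run at `rate` sparsity."""
    for m in model.modules():
        if hasattr(m, "pruning_rate"):
            m.pruning_rate = rate

def joint_kd_step(model, x, y, optimizer, temperature=4.0):
    optimizer.zero_grad()

    # Full network is the teacher: trained on the ground-truth labels.
    set_pruning_rate(model, 0.0)
    teacher_logits = model(x)
    F.cross_entropy(teacher_logits, y).backward()
    soft_targets = F.softmax(teacher_logits.detach() / temperature, dim=1)

    # Each pruned sub-network is distilled from the teacher's soft targets;
    # gradients from all sub-networks accumulate before a single step.
    for rate in PRUNING_RATES[1:]:
        set_pruning_rate(model, rate)
        student_logits = model(x)
        kd_loss = temperature ** 2 * F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            soft_targets,
            reduction="batchmean",
        )
        kd_loss.backward()

    optimizer.step()
```

Accumulating gradients from every pruning rate into one optimizer step is what makes the training joint: the shared weights are updated once per batch to serve all sub-networks, so any rate can be selected at runtime for dynamic offloading.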